Legit Reviews: NVIDIA Kepler versus Fermi in Adobe After Effects CS6

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
After Effects CS6 version 11.0.2.12 finally supports the NVIDIA GeForce GTX 690 card for Ray-traced 3D rendering.

[Image: after-effects-benchmark.jpg (After Effects CS6 ray-traced render benchmark chart)]


" Our test results show that the NVIDIA GeForce GTX 580 'Fermi' video card was about 18% faster than the NVIDA GeForce GTX 680 video card! This result might shock some, but not to others. Kepler is certainly faster when it comes to gaming, but when it comes to raw compute performance the clear leader is still Fermi! A pair of NVIDIA GeForce GTX 570 video cards running in SLI finished the render in just 48 minutes, which is impressive considering NVIDIA's flagship GeForce GTX 690 video card completes the same task in 58 minutes!

Are you wondering how long the render took using just the Intel Core i7 2700K processor at 5GHz in "classic 3D" mode? It took only 7 minutes and 36 seconds, but was about 75% of the quality of the ray-traced version! Running Ray-traced 3D on the NVIDIA graphics card was the way to go when it came to quality, but that is another story for a different day!"
Source

Right now a GTX 570 + OC gives a similar level of performance to a GTX 690. Assuming Titan fixes Kepler's compute performance, it might be the ticket for those who play games and do other things with their GPU but can't afford K20/K20X cards.
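For reference, a quick back-of-the-envelope sketch of the quoted render times. Only the 48- and 58-minute figures come from the article excerpt; the per-card timings behind the 18% figure aren't given, so this only checks the SLI-vs-690 gap:

```python
# Render times (minutes) as quoted in the Legit Reviews excerpt.
gtx570_sli_min = 48.0  # two GTX 570s in SLI
gtx690_min = 58.0      # single GTX 690 (dual GK104)

# How much longer the GTX 690 takes relative to the 570 SLI pair:
extra = (gtx690_min - gtx570_sli_min) / gtx570_sli_min * 100
print(f"GTX 690 takes {extra:.0f}% longer than GTX 570 SLI")  # ~21% longer
```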
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
Kinda shows their strategy to a full extent.

And it helps the market segmentation further, by not having a lot of people buy 580s for compute work.



Mo money for nvidia.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I actually really like how Nvidia separated the focuses of their two product lines. I would like to see a more purely gaming-oriented card from both vendors and let those who need the number-crunching grunt spend more money for it.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,989
620
126
I actually really like how Nvidia separated the focuses of their two product lines. I would like to see a more purely gaming-oriented card from both vendors and let those who need the number-crunching grunt spend more money for it.
The compute power of the GPU can also be used for gaming. Not to mention it's high time that the GPU is used more and more for general purpose applications, it's such a waste to have your GPU sit there and do next to nothing when you're not gaming.
 

thilanliyan

Lifer
Jun 21, 2005
12,013
2,234
126
I actually really like how Nvidia separated the focuses of their two product lines. I would like to see a more purely gaming-oriented card from both vendors and let those who need the number-crunching grunt spend more money for it.

No thanks!! I'm glad AMD allows me to have both at a reasonable price, especially for BTC mining.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,989
620
126
I can't understand why anyone would want to have to pay thousands for features that were previously in a consumer-grade part like Fermi. :confused:
 

SPBHM

Diamond Member
Sep 12, 2012
5,065
418
126
For this comparison, GK104 vs. GF114/104 would be more interesting, I guess.

People looking for this kind of stuff already know they should avoid GK104; it's a gaming GPU.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
For this comparison, GK104 vs. GF114/104 would be more interesting, I guess.

People looking for this kind of stuff already know they should avoid GK104; it's a gaming GPU.

Exactly, one can only ask why they avoided the Fermis below 570. We might see the real successor next month.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I actually really like how Nvidia separated the focuses of their two product lines. I would like to see a more purely gaming-oriented card from both vendors and let those who need the number-crunching grunt spend more money for it.

Same here. I called for it for a while and nvidia delivered.
The focus on gaming means that, unlike Fermi, the 6XX series is not a furnace. It gets better gaming performance at lower power consumption and is quieter.

The issue with the claim that "compute can be used for gaming" is that the term "compute focus" is misleading. It's really a single-precision (FP32) vs. double-precision (FP64) focus.
Games simply don't need floats larger than FP32 (they do need more FP32 performance, though).
Your science calculation could be fine with FP32, require FP64, or even require something other than those two (such as quad precision)...
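The FP32 vs. FP64 distinction above is easy to demonstrate even on a CPU. A minimal Python sketch (Python's native float is FP64, so `struct` is used here to round a value through FP32):

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python float (FP64) through single precision (FP32)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# FP32 has a 24-bit significand: above 2**24 consecutive integers are
# no longer representable, so adding 1 is silently lost to rounding.
assert to_f32(2**24 + 1) == 2**24   # 16777217 rounds back to 16777216

# FP64 has a 53-bit significand and represents the same value exactly.
assert 2**24 + 1.0 == 16777217.0
```

A simulation that accumulates millions of such small increments drifts badly in FP32 but stays exact in FP64, which is why some workloads genuinely need the double-precision throughput that Fermi offered and GK104 cut back.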
 

Rezist

Senior member
Jun 20, 2009
726
0
71
I like that they split them as well, but I never guessed that they would leave the price of a 300mm2 GPU the same as a 500mm2 GPU. Might as well not have cut the compute stuff out if it ends up all costing the same in the end.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
for this comparison GK104 vs gf114/104 would be more interesting I guess,

people looking for this kind stuff already know they should avoid GK104, it's a gaming GPU.

Except GF114 was a $250 part at release, not $500. On perf/$, the 580 vs. 680 comparison is valid.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
I just like that the 680 plays my games better and doesn't sound like a Harrier jet about to take off.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Not everyone uses their cards for gaming, or just gaming. There are other performance aspects that matter. If you were using AE you'd care about these things.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I like that they split them as well, but I never guessed that they would leave the price of a 300mm2 GPU the same as a 500mm2 GPU. Might as well not have cut the compute stuff out if it ends up all costing the same in the end.

It's not going to cost the same though. NV might charge $899 for a 550mm2 Titan chip that theoretically is a true successor to the GTX480/580 generation.

Nov 2006 = 484mm2 8800GTX $599
June 2008 = 576mm2 GTX280 $649 (1 month later dropped to $499) = + 63% perf.
Nov 2010 = 520mm2 GTX580 $499 = + 73% perf.
http://www.computerbase.de/artikel/grafikkarten/2011/bericht-grafikkarten-evolution/3/

March 2012 = 294mm2 GTX680 $499 = +35% perf.
Feb-March 2013 = ~ 550mm2 Titan $899 = +50-60% perf.

What's wrong with that picture? NV suddenly wants to raise the price of a flagship GPU beyond the $499-649 we generally paid for a similar die size and generational performance increase. JHH has been drinking too much Apple Kool-Aid. :p
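Working those launch figures into price per mm2 of die makes the shift explicit (die sizes and prices as listed above; the Titan row is the rumored figure, not a confirmed price, and performance deltas are ignored):

```python
# Launch die size (mm^2) and launch price (USD) as quoted above.
cards = [
    ("8800GTX", 484, 599),
    ("GTX280",  576, 649),
    ("GTX580",  520, 499),
    ("GTX680",  294, 499),
    ("Titan (rumored)", 550, 899),
]
for name, mm2, usd in cards:
    print(f"{name:>16}: ${usd / mm2:.2f} per mm^2")
# The GTX680 and rumored Titan land near $1.70 and $1.63 per mm^2,
# versus roughly $0.96-$1.24 for the earlier flagships.
```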

I am not sure the people who have been waiting to upgrade from a GTX 480/570/580, and who use their GPU for tasks outside of gaming, are going to be thrilled about the direction NV is going if GK110 doesn't at least trickle down to the $499 level as a GTX 770 or something.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
For the same reasons, AMD makes a gaming and a workstation model of the same card, the W9000 to be precise. Why would anyone pay $3,000 for it? It's an empty argument. GK104 Kepler still does many compute tasks much faster than Fermi, but not all.

http://www.3dworldmag.com/2012/12/13/nvidia-quadro-k5000/
The Quadro K5000 offers a big leap in modelling performance, but James Morris finds that it’s not a clear winner in every area
So the K5000 beats the W9000 across the board, and by significant quantities in the all-important 3D modelling viewsets lightwave-01 and maya-03. We ran the same Bunkspeed CUDA-enhanced rendering test as we did for Boston’s Tesla-powered Venom 2300-7T. The test scene took 154 seconds with the CPU alone – almost the same as the Venom – which fell to 106 seconds with the K5000 helping out. However, the Venom took 72 seconds with the Tesla and Quadro 4000, implying that the K5000’s modelling abilities aren’t quite as stunning for CUDA-powered rendering

As a hobby or learning tool, people do use gaming cards, but it's almost never the driving reason to buy one. It's an added benefit, and whether a job takes 20% longer or shorter matters less if you only run a few jobs a week/month/year.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
I understand their methodology even if I don't fully agree. They have a "Quadro" lineup for the professional segment, and the number of consumers who use their cards for anything besides gaming is minimal at best. Besides, optimizing the apps for both consumer and pro cards would raise the price of the consumer cards significantly.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
AMD is irrelevant in this thread.

We do have people ask what's the best consumer card to get for different apps, especially Adobe apps. This just lets people know that a GTX 580 would be superior to a GTX 680. That's not typically the case, and it would be an easy mistake to make.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Not everyone uses their cards for gaming, or just gaming. There are other performance aspects that matter. If you were using AE you'd care about these things.

Very expensive software, more than the cost of a dedicated rig. I guess if you need to do both on one rig, get Fermi or AMD. Kind of a rare use case for nVidia to have to worry about, considering the vast majority of purchasers of its gaming cards buy them to play games.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
AMD is irrelevant in this thread.

We do have people ask what's the best consumer card to get for different apps, especially Adobe apps. This just lets people know that a GTX 580 would be superior to a GTX 680. That's not typically the case, and it would be an easy mistake to make.

Sorry, you are right. Many people wouldn't know. I just thought people were complaining about the strategy nVidia took in producing the 680.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
AMD is irrelevant in this thread.

We do have people ask what's the best consumer card to get for different apps, especially Adobe apps. This just lets people know that a GTX 580 would be superior to a GTX 680. That's not typically the case, and it would be an easy mistake to make.

IMO it is relevant; that is why I brought it into the discussion. And the answer to your proposed member question would have to be answered with a lot more depth to be of any value to the questioner.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The same reasons, AMD makes a gaming and workstation model of the same card. The W9000 to be precise. Why would anyone pay 3000 for it? It's a empty argument. GK104 Kepler still does many compute tasks much faster than Fermi, but not all.

http://www.3dworldmag.com/2012/12/13/nvidia-quadro-k5000/

It's not an empty argument. NV removed features but kept the price at $499. We are not talking about hardcore professional apps, but about using a GPU for consumer apps like After Effects CS6, Adobe Premiere, etc. NV's entire strategy has been shaped by GPGPU since G80 was introduced. Every single flagship card has improved on GPGPU since G80, except the GTX 680.

It is not a coincidence that NV's most important metric for generational increases is double-precision GFLOPS/Watt, not FPS/Watt.

The entire strategy behind Maxwell is even more GPGPU/compute. Maxwell GPUs would also be integrated with Nvidia's Project Denver, which fuses general-purpose ARM cores alongside the GPU core. Project Denver is basically a custom-built ARMv8 64-bit processor, which would be highly beneficial for computing purposes such as workstation and server usage. Either the GTX 680 was a bastard-child outlier in NV's overall HPC/GPGPU strategy since G80, or NV intends to charge high prices for GPUs with full compute functionality moving forward.

We'll have to see if more games start using compute, but if the PS4 and Xbox 720 also use HD 7000 GCN GPUs, and developers use OpenCL/compute to accelerate graphical effects on consoles and then port those games to the PC, we could see compute become more relevant in games.

Compute isn't just about FP32 vs. FP64; it's also about performing certain graphical calculations more efficiently via compute shaders. Most people think of compute as only FP32 vs. FP64, and that's incorrect. As GPUs become more general-purpose, the industry is moving to compute because GPUs shouldn't only be used for games. Wouldn't it be great if you could use a GPU to accelerate other programs on your PC outside of games and video encoding? Well, it won't happen unless consumer GPUs embrace compute functionality across the board.
 
Mar 10, 2006
11,715
2,012
126
No thanks!! I'm glad AMD allows me to have both at a reasonable price, especially for BTC mining.

It's because AMD doesn't charge for features that cost them a lot to put in that they are losing money and are on the verge of bankruptcy...

AMD is like a pretty girl that lets herself be abused and controlled by evil men because she has incredibly low self-esteem.
 

thilanliyan

Lifer
Jun 21, 2005
12,013
2,234
126
It's because AMD doesn't charge for features that cost them a lot to put in that they are losing money and are on the verge of bankruptcy...

AMD is like a pretty girl that lets herself be abused and controlled by evil men because she has incredibly low self-esteem.

Wait, wasn't AMD overcharging with the 7900 series? Oh right, they are not a "premium" brand in the eyes of a lot of people, so they couldn't charge more even if they wanted to. Did you see the whining on this forum about prices when the 7900 cards were released?! It was ridiculous. To put it back to you, it could be that nVidia is OVERcharging by taking features out... but some people are still willing to buy their cards anyway. From what I have experienced, nVidia is a better-known brand than AMD/ATI for the average person, so they can charge more (like Apple can), and it is hard to get out of that rut IMO. AMD is trying harder right now (more AAA Gaming Evolved titles) I think... time will tell if it works out.

All that said, AMD is in trouble because of their CPU division...not their GPU division. Hopefully the console wins will give them a bit of a lifeline.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
I am not sure the people who have been waiting to upgrade from a GTX480/570/580 who use their GPU for tasks outside of gaming are going to be thrilled about the direction NV is going if GK110 doesn't at least trickle down to a $499 level as GTX770 or something.

I'm resigned to the fact that both Nvidia and AMD are going to be charging significantly more per mm^2 for their chips; I just hope that what you are saying happens. I mean, it has to, right? Nvidia is the king at binning and partitioning chips for different price points, and there are surely going to be quite a few GK110 dies that won't make the cut as either a K20 or a top-end GeForce Titan.

I've said it before, but I hope this rumored MSRP is wrong. Three months ago I'd have thought it would be crazy to hear myself say this, but I hope the top tier 6gb GK110 Geforce card comes in at $799 - $100 less than the rumored MSRP. The 3gb model $699. Then a fused off GK110 part for $599. And maybe another (akin to the gtx560ti 448)for $519 or so. That could put a GK114 part at $449, and things can go from there.