NVIDIA GeForce GTX 780 To Be Based on GK114 GPU


tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Well, they have to do something, and soon.

Usually, new high-end GPUs tend to look like ridiculous overkill in benchmarks until the end of their cycle. Even the GTX 480 started to look inadequate in some games before it was refreshed. But this generation, at 1080p just about everything runs silky smooth on an HD 7950/GTX 670 or lower. The high-end cards are almost never challenged at all unless the resolution is 1440p (or higher), and there don't seem to be any games in sight that can take advantage of the extra horsepower. Hopefully there are more demanding games and/or more PC-specific enhancements (vendor-specific or not) coming to games next year. I'd like to see developers take bigger advantage of scaling graphical capabilities with the hardware. Who knows, though.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I wouldn't get your hopes up too high unless Nvidia wants to be power consumption king again.

You're saying that like a ~300 W TDP is a bad thing?
As if it were decoupled from performance?

The question is whether we need more performance than the current gen delivers.
The answer is yes!

We are far from reality-level fidelity... there's a long way to go still.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Power consumption being higher for Nvidia means nothing, obviously. The only time it should matter is if AMD has the top card.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
I wouldn't get your hopes up too high unless Nvidia wants to be power consumption king again.

Because AMD will run on antimatter and therefore cool your PC?

A full-blown GK110 will be superior to your 670s IN SLI. You can quote me on that :cool:
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
Because AMD will run on antimatter and therefore cool your PC?

A full-blown GK110 will be superior to your 670s IN SLI. You can quote me on that :cool:

I don't have an SLI setup, and nobody knows exactly what GK110 will do. As we learned from the whole Bulldozer debacle, specs on paper mean nothing until we see benchmarks.


I really do hope it performs well and beats AMD. That means I get a cheap 8970 :)
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I wouldn't get your hopes up too high unless Nvidia wants to be power consumption king again.

As has been proven time and again, customers forgive power consumption if the performance is there AND the cooling solution is adequate. Node maturity, plus the continued refinements Nvidia is making to Kepler between GK104's release and GK110 becoming a GeForce card, will help alleviate some of the power draw issues that people like you and I think GK110 may have.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
The final engineering of the 8970 and its drivers might have to be done pro bono. :p
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
As has been proven time and again, customers forgive power consumption if the performance is there AND the cooling solution is adequate. Node maturity, plus the continued refinements Nvidia is making to Kepler between GK104's release and GK110 becoming a GeForce card, will help alleviate some of the power draw issues that people like you and I think GK110 may have.

The only issue I have with GK110 is going to be the price; I know I have the power for it. I regretted purchasing 2x GTX 680s, so with buyer's remorse I sold both of them. So far this GTX 670 is proving worthy, since it seems to be a surprisingly good overclocker. I'll probably just buy both and sell the one I like less. I know I'll end up with both of them eventually anyway, so why not just buy both, right?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Power consumption being higher for Nvidia means nothing, obviously. The only time it should matter is if AMD has the top card.

I like how you believe this generation has anything in common with the one you're attempting to reference.

Metro 2033 already does that

Metro 2033 uses Compute Shader DoF, which, aside from being an awful screen-blurring effect, causes a large performance hit on all cards. If you disable it and use Analytical AA, it's not very demanding.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Power consumption being higher for Nvidia means nothing, obviously. The only time it should matter is if AMD has the top card.

Find me one place where I promote perf/watt.

I'm all about performance and image quality (IQ).

No matter the brand.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I wouldn't mind a 300W monster GPU as long as it performed accordingly, or better than anything else out there. When I was running 2x 480s, they were hot and drank some serious juice, but they rocked anything. So, if Nvidia wants to let loose a "GK110" consumer card with the rumored specs (2880 CUDA cores, 15 SMX, 384-bit, up to 24GB GDDR5), then bring it on. :D
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The GTX 480 was still welcome, but to me the constructive nit-pick was performance per watt, which was improved upon with the 5xx series and even more so with the 6xx series.

A big-die Kepler is more than welcome, but I would like to see more balance than the GTX 480 and GTX 470 SKUs offered. If it comes, I'm curious how GPU Boost will play a hand with the bigger die, and whether there are other efficiency improvements.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Power consumption being higher for Nvidia means nothing, obviously. The only time it should matter is if AMD has the top card.

nVidia had the top card and did take the single-GPU performance crown from AMD's 7970. With the release of the GHz Edition 7970 and more mature drivers, to me, AMD retook the single-GPU crown -- which I think is great, BTW. Stronger competition from both parties creates more choice and value, and hopefully innovation, with improved gaming-experience potential for their customers.

Power and over-all efficiency matters.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
You will win that bet hands down. A full GK110 will never, ever be 90% faster than GK104; I don't know how anyone could come to such a conclusion. The only scenario where this could happen is when the 670 SLI setup has too little VRAM.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
0.91 x 1.70 = 1.55 (a GTX 670 at roughly 91% of a GTX 680, times ~70% average SLI scaling).

And that's even without mentioning SSAA, SGSSAA, multi-monitor setups, or tanking due to VRAM.
So yeah, at 150% of GTX 680 fps, GK110 is superior to 670 SLI at 155%.
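
To make the arithmetic explicit, here is a minimal sketch of the comparison. All three numbers are the post's assumptions, not benchmarks: a GTX 670 at ~91% of a GTX 680, ~70% average SLI scaling, and a rumored GK110 at ~150% of a GTX 680.

```python
# Back-of-the-envelope comparison of 670 SLI vs a rumored GK110.
# Every figure is an assumption from this thread, not benchmark data.

GTX_670_VS_680 = 0.91   # a GTX 670 at roughly 91% of a GTX 680
SLI_SCALING    = 1.70   # ~70% average gain from the second card
GK110_VS_680   = 1.50   # rumored big-die Kepler at ~150% of a GTX 680

sli_670 = GTX_670_VS_680 * SLI_SCALING   # 670 SLI relative to one GTX 680

print(f"GTX 670 SLI ~= {sli_670:.2f}x a GTX 680")        # ~1.55x
print(f"GK110 rumor ~= {GK110_VS_680:.2f}x a GTX 680")   # 1.50x

# The post's argument: the ~5% raw-fps gap is smaller than what SLI loses
# to microstutter, VRAM limits, and titles that don't scale, so a single
# GK110 at ~1.5x would effectively match or beat 670 SLI.
```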
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Except that SLI scales much better than 70%. What could happen is that you run into a CPU bottleneck with both 670 SLI and GK110.
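
To see how much that scaling assumption matters, here is a quick sensitivity sketch using the same hypothetical numbers as above; the range of SLI gains is illustrative, not measured.

```python
# How the 670-SLI-vs-GK110 comparison shifts with the assumed SLI scaling.
# All figures are the thread's assumptions, not benchmark data.

GTX_670_VS_680 = 0.91   # a GTX 670 at roughly 91% of a GTX 680
GK110_VS_680   = 1.50   # rumored GK110 at ~150% of a GTX 680

for gain in (0.70, 0.80, 0.90, 0.95):        # assumed gain from the second 670
    sli_670 = GTX_670_VS_680 * (1.0 + gain)
    verdict = "GK110 ahead" if GK110_VS_680 >= sli_670 else "670 SLI ahead"
    print(f"SLI gain {gain:.0%}: 670 SLI ~= {sli_670:.2f}x a 680 -> {verdict}")
```

At 70% scaling the raw numbers favour 670 SLI only slightly; at 90%+ scaling, as argued here, the gap widens well beyond what single-GPU advantages could plausibly cover.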
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I wouldn't mind a 300W monster GPU as long as it performed accordingly, or better than anything else out there. When I was running 2x 480s, they were hot and drank some serious juice, but they rocked anything. So, if Nvidia wants to let loose a "GK110" consumer card with the rumored specs (2880 CUDA cores, 15 SMX, 384-bit, up to 24GB GDDR5), then bring it on. :D

Very much this.

Until the day multi-GPU has no side effects vs. single GPU (microstutter, input lag, etc.), I will always support the fastest single GPU possible.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,112
136
I wouldn't mind a 300W monster GPU as long as it performed accordingly, or better than anything else out there. When I was running 2x 480s, they were hot and drank some serious juice, but they rocked anything. So, if Nvidia wants to let loose a "GK110" consumer card with the rumored specs (2880 CUDA cores, 15 SMX, 384-bit, up to 24GB GDDR5), then bring it on. :D

24GB of GDDR5!! I think 3GB/6GB would future-proof this card, and then some. Or are you planning on running six 30" monitors or something? :confused:
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Except that SLI scales much better than 70%. What could happen is that you run into a CPU bottleneck with both 670 SLI and GK110.


Multi-GPU brings along drawbacks, e.g. microstuttering (AFR), so one cannot compare the two on an equal footing.
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
24GB of GDDR5!! I think 3GB/6GB would future-proof this card, and then some. Or are you planning on running six 30" monitors or something? :confused:

That spec is being thrown around due to rumours about the GK110 Tesla. I doubt it will make it into the GeForce cards.