[VC]NVIDIA GeForce GTX 980, GTX 980 SLI, GTX 970, 3DMark performance

Feb 19, 2009
10,457
10
76
Yeah, but how long before we see those high-end parts? Will the miserable sods drip-feed them out after a year's delay again? That would still equate to stagnation in the grand scheme of things.

It's quite good on a technical level; it's practically unheard of for a same-node mid-range part to perform similarly to a top product. That's normally the realm of a node shrink.

This is by no means stagnation like what we have on the CPU side from Intel due to the lack of AMD competition.

Here, it's a case of making the most of being on 28nm for so long. Price-wise, I don't expect the top GM204 to be <$500, even though they could price it quite low thanks to a mature 28nm node with obviously very high yields on a smaller die. But they have no pressure to price it low, and certainly lots of NV users are willing to pay a premium for it.
 

PC Perv

Member
Nov 6, 2009
41
0
0
Assuming ballpark figures, ~$400 for the 980 would be a nice upgrade for the GK104 crowd. Anything higher than that... Well, I guess NV's royal customers will once again keep NV afloat. :)


Posted from Anandtech.com App for Android
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
When will the 20nm GeForce cards come out? Will those be named GTX 1070/1080 and arrive next spring? This should be the Maxwell refresh on 20nm, yes?
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
ROFL, a 256-bit memory interface, peasant card Nvidia. Where is the replacement for my 780 Ti GHz? It hits 1215MHz core in-game... I have no incentive to upgrade.
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
Assuming ballpark figures, ~$400 for the 980 would be a nice upgrade for the GK104 crowd.
I'm not sure about that. I have a GTX 680 which I could sell and upgrade to a GIGABYTE R9 290 WindForce for $370 (http://www.amazon.com/gp/product/B00...JDPXGT0JT8C37Q), which would be a better deal IMO.
But I don't usually buy high-end at retail; I snag them used once the next generation comes out. EDIT: doh, GM204 is not high-end ;)
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
I don't feel ripped off with the 680s. They saved me from the 7970s (which really were a rip-off considering how that turned out for me), and price-wise I felt the performance improvement was reasonable at the time. The 980 is a different beast; it's effectively an architecture change on the same process, so sort of a super refresh of the gamer end of Kepler. Its closest cousin is actually the 680/770, not the 780 it will be compared to more often than not.

I do feel kind of ripped off. The second GTX 780 in my system has been slightly above a paperweight. Since I have gotten it I played three games: Watch Dogs, Titanfall, and Dark Souls 2.

SLI with Watch Dogs was a stuttering piece of garbage, which surprised me because it's part of their GameWorks/TWIMTBP program.

Titanfall just got SLI support like a week ago... I stopped playing that game two months after it came out.

Dark Souls 2 was flickering with SLI on, so I beat it with SLI off; they apparently fixed the shadow flickering in a later update.

So yeah, the superior NVIDIA drivers have been nothing more than a myth for me so far. They can still make it right by having a properly functioning driver/SLI profile for The Witcher 3 when it launches, but I don't think I'm going to go with multi-GPU again.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Of course they will release such a thing. And they will price it exactly right: a sweet spot at which you maximize your profit (and other considerations.) That's how you run a successful business.

Are you concerned about the buyer (value etc.) or the company (and its profit$)? :eek:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You never say a LOT of things.

Launch driver. For GM204.

Just so I got the point home.

That's right, I don't. You continue to read into things stuff that isn't there.

I used to think it was a talent but I'm starting to consider that it might be paranoia.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
That's right, I don't. You continue to read into things stuff that isn't there.

I used to think it was a talent but I'm starting to consider that it might be paranoia.

Sometimes, it's wise to be.

But have I made it clearer what I meant before? I just want to make certain you understood what I meant.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
But the performance increase would be quite small. The GTX 780 was released about 1.5 years ago. One would expect a bit more than ~30% in that time frame.

Well, don't tell that to AMD; you'll make them feel REALLY bad:

7870 was released TWO and a half years ago.
The 260X is their 28nm refresh, one size down.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I thought the 270/270X was AMD's 7850/7870 rebadge? Not the 260/X
If I have missed the context, apologies.
 
Last edited:

iiiankiii

Senior member
Apr 4, 2008
759
47
91
Well, don't tell that to AMD; you'll make them feel REALLY bad:

7870 was released TWO and a half years ago.
The 260X is their 28nm refresh, one size down.


The 270/270X are the rebranded 7850/7870. Yes, AMD has been very stagnant with their mid-range cards. Nvidia is guilty of this, too: the 770/760 are essentially rebranded 680/670. That's what happens when GPUs are stuck on the same node for this long. Stagnation.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
I thought the 270/270X was AMD's 7850/7870 rebadge? Not the 260/X
If I have missed the context, apologies.

Comparing the GTX 780 to the 980 is not comparing Big Kepler to Big Maxwell; it's comparing Big Kepler to a smaller-die chip. It's "GTX 480 to GTX 560 Ti", not "GTX 480 to GTX 580." So the AMD analogue isn't between Pitcairn and the Pitcairn rebadge; it's between Pitcairn and the smaller Bonaire.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Comparing the GTX 780 to the 980 is not comparing Big Kepler to Big Maxwell; it's comparing Big Kepler to a smaller-die chip. It's "GTX 480 to GTX 560 Ti", not "GTX 480 to GTX 580." So the AMD analogue isn't between Pitcairn and the Pitcairn rebadge; it's between Pitcairn and the smaller Bonaire.

You need to compare either the 7970 or 290 with the upcoming 3(70X?), the GCN 1.x -> GCN 2.0 jump, which is an actual architecture update (Big Kepler vs. mid-range Maxwell, a major generation update). Your post missed the proper comparison by going from the 7870 to the 260/X.

Either way, yeah, it's been stagnant for the past few years; it took forever to get the actual high end updated. These puny trickle-down updates are getting old. 4K will be picking up over the next year or two, and it really needs more GPU power.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
You need to compare either the 7970 or 290 with the upcoming 3(70X?), the GCN 1.x -> GCN 2.0 jump, which is an actual architecture update (Big Kepler vs. mid-range Maxwell, a major generation update). Your post missed the proper comparison by going from the 7870 to the 260/X.

Crap, you're right. I forgot Bonaire was released as the 7790 one year after Pitcairn; I was thinking it was new at the 260X.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Comparing the GTX 780 to the 980 is not comparing Big Kepler to Big Maxwell; it's comparing Big Kepler to a smaller-die chip. It's "GTX 480 to GTX 560 Ti", not "GTX 480 to GTX 580."

What you are saying is perfectly valid; we are talking about a next-gen mid-range part vs. a previous-gen flagship. But you forgot the pricing aspect:

The 560 Ti launched at $249. Similarly, the 460 1GB beat the 280/285 for $230.

It's obviously unfair to expect NV to price a new mid-range architecture at $230-250, since manufacturing costs for new fabs, and thus wafers, have increased. However, let's accept your analogy once again: the 480 and 580 cost $499, while the 780 Ti went up to $699, a 40% price increase.

Let's now take the 980 as a 460/560 Ti-style replacement, since you yourself admitted it's mid-range. Apply a 40% price increase to $250 and we get $350. Add a $50 premium for the performance/watt aspect that everyone is crazy about lately, and $399 is really a fair price for the 980, not $500-550. The second reason $499 is way too much is that when NV raised the mid-range 680 to $499, at least it beat the 580 by 30-35%.
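
To spell that arithmetic out, here's a quick back-of-the-envelope sketch (Python; the prices are the launch MSRPs quoted above, and the $50 efficiency premium is just an arbitrary allowance on my part):

# Sketch of the pricing argument above, using the launch prices quoted in this post.
kepler_midrange_launch = 250        # ~GTX 460 1GB / 560 Ti launch bracket, in USD
flagship_increase = 699 / 499       # 780 Ti vs. 480/580 launch price, roughly a 40% bump

fair_price = kepler_midrange_launch * flagship_increase    # ~350
efficiency_premium = 50                                     # arbitrary allowance for perf/watt
print(round(fair_price + efficiency_premium))               # -> 400, i.e. roughly $399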

Also, your 260X vs. 7850/7870 analogy doesn't apply, since the 260X is not a next-generation product, just a refresh of GCN 1.0, whereas Maxwell vs. Kepler is a brand-new architecture replacement akin to VLIW-4/5 to GCN or Fermi to Kepler.

----------------------------------

I am inclined to believe that the Boost clock of the 980 is 1178MHz, since in SLI they also tested it at that speed. This also coincides with a leak of the 870 (aka 970) a month earlier, which had a GPU-Z shot with a 1178MHz Boost clock. Based on the 3DMark scores, the 1150/1750MHz 780 Ti is slightly slower than an 1178/1753MHz 980. If the 980 overclocks to 1.3GHz like the 750 Ti, it should be about 16-18% faster than a 1.25GHz 780 Ti. It seems like if NV had access to 20nm, the 980 would have achieved a 680-style leap over the 780 Ti.

It's quite good on a technical level; it's practically unheard of for a same-node mid-range part to perform similarly to a top product. That's normally the realm of a node shrink.

This is by no means stagnation like what we have on the CPU side from Intel due to the lack of AMD competition.

Agreed. If a 28nm 980 beats the 780 Ti at 170-180W, then a 20nm/16nm GM210 will be a major leap over the 780 Ti! NV could then release a shrunken GM204 by the end of 2015 at $249-299 as a GTX 1060 or whatever they'll call it. The prospect of $249-299 for 20/16nm GTX 780 Ti-style performance at 120-130W by the end of 2015 is pretty sweet.

Exaggerating? There aren't any R9 290 cards going for $350. There is one for $370, three more under $400, and ten at $400+. The ASP is right around $400, not $350.

After a $40 Newegg gift card and rebates, there are a few R9 290s below $350.

HIS IceQ R9 290 - $334
Diamond R9 290 - $344

R9 290X is enjoying similar promotions.

Diamond R9 290X is $443

These 'promotions' may be a signal that AMD is ready to adjust MSRPs on the 290/290X to $299-329 and $399-429, respectively. A $399/$499 970/980 will be superior from a tech perspective, but $80-100 is often a lot of money to move from one tier to the next for power efficiency and the "coolness" factor alone. I mean, would you pay $499 for a 980 when the R9 290 is $299-329? What will be more interesting is AMD's response after the R9 290/X, since they will be in NV's 670/680 position of having known performance and pricing targets. Price wars are definitely in the making over the next 6-9 months :)
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
These 'promotions' may be a signal that AMD is ready to adjust MSRPs on the 290/290X to $299-329 and $399-429, respectively. A $399/$499 970/980 will be superior from a tech perspective, but $80-100 is often a lot of money to move from one tier to the next for power efficiency and the "coolness" factor alone. I mean, would you pay $499 for a 980 when the R9 290 is $299-329? What will be more interesting is AMD's response after the R9 290/X, since they will be in NV's 670/680 position of having known performance and pricing targets. Price wars are definitely in the making over the next 6-9 months :)

I missed the gift cards being attached to the cards. Definitely looks like AMD is prepping for price drops. If Nvidia comes in above $500 for the GTX 980, it'll be a disappointing launch for perf/$. The needle will hardly have moved at all considering current pricing for all existing products from either vendor.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The move to 4GB is pretty important too. The lousy console port optimizations (Titanfall, Dead Rising 3, Watch Dogs) are pounding the VRAM. It seems even 4GB of VRAM may very well be a bottleneck for next-gen games at 4K/multi-monitor resolutions, which may prompt users targeting those resolutions to consider 6-8GB GPUs.

[Image: Dead Rising 3 VRAM usage chart from gamegpu.ru]
 

96Firebird

Diamond Member
Nov 8, 2010
5,749
345
126
Based on a die shot from an engineering board (see http://videocardz.com/50980/nvidia-geforce-gtx-880-pictured), everyone seems to have accepted that the GM204 die will be bigger than GK104's 294mm2. I can't find the source because I'm a little rushed, but I think the estimated die size is north of 400mm2 (I could be wrong).

Alright that makes more sense then.

It's similar to Hawaii in die size, actually. Definitely 400mm2+, which dampens my excitement somewhat since I was still expecting mid-range Maxwell at ~300mm2 like GK104 was.