[Sweclockers] GeForce GTX 870 and GTX 880 coming this fall


Cloudfire777

Golden Member
Mar 24, 2013

Both 20nm and 16nm designs have been worked on for a while. Not surprising, considering that Nvidia said they received 20nm samples from TSMC when GK104 launched back in 2012.

There are also many mentions of Maxwell on LinkedIn, but they don't really say much.
 

Grooveriding

Diamond Member
Dec 25, 2008
I would not have counted the 200 and 400 series, but rather these:

(2004) 6800 GT vs 5950 Ultra, 9800 Pro (higher than 50% improvement)

http://www.anandtech.com/show/1383/7

(2006) 8800 GTX vs 7900 GTX - much higher than 50%

http://www.anandtech.com/show/2116/22

The 8800 was probably Nvidia's most significant release of the last 10 years.

I'd disagree. I think the full GK110 in the 780 Ti/TB has become the most significant. If we ignore Nvidia selling a midrange card as a flagship with the 680 and compare the 580 to the 780 Ti, the flagships of 40nm and 28nm respectively, it's a full doubling of performance, sometimes even more. Also, the 8800 GTX used more power than the 7900 GTX, whereas the 780 Ti uses less than the 580 did.

After the 680 episode, that's why it's not worth bothering with these new mid-range 'flagships'. I'll wait for the real Maxwell flagship on 20nm with the big die behind it. The jump could easily be even bigger than 580 to 780 Ti if Maxwell's power characteristics can be maintained on the 20nm node.
 

n0x1ous

Platinum Member
Sep 9, 2010
I'd disagree. I think the full GK110 in the 780 Ti/TB has become the most significant. If we ignore Nvidia selling a midrange card as a flagship with the 680 and compare the 580 to the 780 Ti, the flagships of 40nm and 28nm respectively, it's a full doubling of performance, sometimes even more. Also, the 8800 GTX used more power than the 7900 GTX, whereas the 780 Ti uses less than the 580 did.

After the 680 episode, that's why it's not worth bothering with these new mid-range 'flagships'. I'll wait for the real Maxwell flagship on 20nm with the big die behind it. The jump could easily be even bigger than 580 to 780 Ti if Maxwell's power characteristics can be maintained on the 20nm node.

This. Exactly this. Nvidia's true big-die generational jumps have been great; they just take longer now with the process-node slowdowns. This is exactly how I will buy in the future: skip the even-numbered x80 and wait for the odd-numbered x80 with the big die.
 

toyota

Lifer
Apr 15, 2001
I'd disagree. I think the full GK110 in the 780 Ti/TB has become the most significant. If we ignore Nvidia selling a midrange card as a flagship with the 680 and compare the 580 to the 780 Ti, the flagships of 40nm and 28nm respectively, it's a full doubling of performance, sometimes even more. Also, the 8800 GTX used more power than the 7900 GTX, whereas the 780 Ti uses less than the 580 did.

After the 680 episode, that's why it's not worth bothering with these new mid-range 'flagships'. I'll wait for the real Maxwell flagship on 20nm with the big die behind it. The jump could easily be even bigger than 580 to 780 Ti if Maxwell's power characteristics can be maintained on the 20nm node.
Yes, but technically the 780 Ti does not use less power than the 580 except in FurMark. In actual games the 780 Ti will use more power for sure; it's peaking a whopping 40 watts higher than the 580 in Crysis 2. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html
 

ocre

Golden Member
Dec 26, 2008
It is kind of game dependent.

But the GTX 580 continued to improve over its lifespan.
 

Grooveriding

Diamond Member
Dec 25, 2008
Yes, but technically the 780 Ti does not use less power than the 580 except in FurMark. In actual games the 780 Ti will use more power for sure; it's peaking a whopping 40 watts higher than the 580 in Crysis 2. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html

I'm going off computerbase.de's numbers. They have the Titan using slightly less power than the 580. Unfortunately they no longer include the 580 in the newer reviews where the 780 Ti started showing up. Not exact science, but the extra 3GB of VRAM on the Titan is going to use more power, and the extra enabled cluster on the 780 Ti will use more power, so maybe they roughly balance out. :D

Performance-wise, at launch the Titan was almost twice as fast as the 580; 193% of the performance was the number, I believe. Now you can just compare the 660 Ti to the 780 Ti in their numbers: the 660 Ti is a little faster than the 580 was, and the 780 Ti is twice as fast as a 660 Ti.

The 8800 GTX was the last time we saw a massive doubling in performance. The 285 and 580 were about 60-70% faster than the previous flagships. Now we have the 780 Ti twice as fast as the 580, and in some cases even faster than that. I would put GK110 as the best flagship Nvidia has ever released.
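The chain of relative comparisons above can be sanity-checked in a few lines; the 660 Ti's exact margin over the 580 is a placeholder assumption here, so the roughly 2.2x result is illustrative rather than a measured figure:

```python
# Rough sanity check of the relative-performance chain in the post.
# The 660 Ti's margin over the 580 is an assumed placeholder (~10%);
# only the "twice as fast" ratio comes from the post itself.
perf_580 = 1.00                 # GTX 580 as the 40nm baseline
perf_660ti = 1.10               # "660 Ti is a little faster than the 580" (assumption)
perf_780ti = 2 * perf_660ti     # "780 Ti is twice as fast as a 660 Ti"

gain_over_580 = perf_780ti / perf_580
print(f"780 Ti vs 580: {gain_over_580:.2f}x")  # prints "780 Ti vs 580: 2.20x"
```

Any positive margin for the 660 Ti over the 580 pushes the chained result past 2x, which is the post's point.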
 

toyota

Lifer
Apr 15, 2001
I'm going off computerbase.de's numbers. They have the Titan using slightly less power than the 580. Unfortunately they no longer include the 580 in the newer reviews where the 780 Ti started showing up. Not exact science, but the extra 3GB of VRAM on the Titan is going to use more power, and the extra enabled cluster on the 780 Ti will use more power, so maybe they roughly balance out. :D

Performance-wise, at launch the Titan was almost twice as fast as the 580; 193% of the performance was the number, I believe. Now you can just compare the 660 Ti to the 780 Ti in their numbers: the 660 Ti is a little faster than the 580 was, and the 780 Ti is twice as fast as a 660 Ti.

The 8800 GTX was the last time we saw a massive doubling in performance. The 285 and 580 were about 60-70% faster than the previous flagships. Now we have the 780 Ti twice as fast as the 580, and in some cases even faster than that. I would put GK110 as the best flagship Nvidia has ever released.
Actually, you can see the 780 Ti using more power than the Titan in my link.
 

n0x1ous

Platinum Member
Sep 9, 2010
Actually, you can see the 780 Ti using more power than the Titan in my link.

The Titan has 6GHz RAM vs 7GHz on the 780 Ti, correct? I thought I remembered reading that 7GHz GDDR5 was a big power hog.
 

amenx

Diamond Member
Dec 17, 2004
I'd disagree. I think the full GK110 in the 780 Ti/TB has become the most significant. If we ignore Nvidia selling a midrange card as a flagship with the 680 and compare the 580 to the 780 Ti, the flagships of 40nm and 28nm respectively, it's a full doubling of performance, sometimes even more. Also, the 8800 GTX used more power than the 7900 GTX, whereas the 780 Ti uses less than the 580 did.

After the 680 episode, that's why it's not worth bothering with these new mid-range 'flagships'. I'll wait for the real Maxwell flagship on 20nm with the big die behind it. The jump could easily be even bigger than 580 to 780 Ti if Maxwell's power characteristics can be maintained on the 20nm node.
The 680 was only mid-range in spec, not in market position or product line-up, during its 14-month run as Nvidia's 'flagship' (the 680 came in 3/12 vs the 780 in 5/13), about the time it has traditionally taken GPU makers to come out with a new product line. How can a product that virtually equals the competition's top card be mid-range without a flagship above it to position it as such? o_O

I wonder how many people would wait over a year for a top card to arrive after a mid-range had been released? You had a 680, didn't you? :D
 

n0x1ous

Platinum Member
Sep 9, 2010
The 680 was only mid-range in spec, not in market position or product line-up, during its 14-month run as Nvidia's 'flagship' (the 680 came in 3/12 vs the 780 in 5/13), about the time it has traditionally taken GPU makers to come out with a new product line. How can a product that virtually equals the competition's top card be mid-range without a flagship above it to position it as such? o_O

Because we all knew Nvidia was building a bigger chip; hence GK104 = not the flagship Kepler. When GK104 was being developed it was intended as mid-range, and it was only repositioned to the top end when Nvidia realized it could compete with AMD's best at the time, the 7970. And based on history it was reasonable to assume the 7970 would be AMD's 28nm flagship, since they had never built a chip as large as Hawaii before. With Nvidia, history told us we would get a 500mm²+ part.

I have 670s because I was impatient, but I knew what I was getting into, i.e. midrange; I'll be waiting that extra year for the real flagship Maxwell 980.
 

f1sherman

Platinum Member
Apr 5, 2011
How can a product that virtually equals the competition's top card be mid-range without a flagship above it to position it as such?

Simple. GK104 was the top card and "high-end" in marketing terms.
In architectural terms, the chip clearly fills the same spot as GF104/114, i.e. mid-range.

We are stumbling over our own terms: high-end, mid-range...
 

amenx

Diamond Member
Dec 17, 2004
Simple. GK104 was the top card and "high-end" in marketing terms.
In architectural terms, the chip clearly fills the same spot as GF104/114, i.e. mid-range.

We are stumbling over our own terms: high-end, mid-range...
Yep, that's what I meant. No one's going to care about mid- or high-end labelling if it competes with the competition's best at a good price, especially if the so-called flagship is at some distant point in the future.
 

Cloudfire777

Golden Member
Mar 24, 2013
I doubt they will release overpriced 880 "mid-range" cards as high end only to release full high-end cards three months later as a 980. Either the 880 midrange isn't actually released as an 880 (an 860 or so instead), or they will milk 880 buyers as the expensive high end for as long as possible, not just for three months.

If AMD releases Pirate Islands on 20nm three months after GM204, I don't think Nvidia has much choice left but to release a new chip. 20nm vs 28nm is a really unfair advantage. Of course, it depends on how much performance over GM204 AMD plans for Pirate Islands. If they position the first Pirate Islands only a tiny bit above GM204 but beat it severely in power and efficiency, Nvidia might not need to release GM200.
 

Saylick

Diamond Member
Sep 10, 2012
Yep, that's what I meant. No one's going to care about mid- or high-end labelling if it competes with the competition's best at a good price, especially if the so-called flagship is at some distant point in the future.

This is absolutely true in that if something new gets released and it is indeed faster than the previous generation, we have a new high end. What's changed, however, is that the high end which replaces the old high end will now be only 35% faster than what came before, as opposed to the 50%+ improvements we've seen in the past. The issue is that a 35% improvement for the same price we've been used to paying isn't exactly good value for our wallets.

At the rate we're going, assuming AMD doesn't release big dies anymore, we're bound to see $500 GPUs which outperform last-gen $650 GPUs, followed by the release of new $650 GPUs which outperform said $500 GPUs. I don't mean to sound like a broken record here, but all of this means the price points we're used to seeing will be bumped up a solid $150. Sooner or later we're all going to get used to this new pricing scheme and start believing the $500-class GPUs are worth it because they offer last-gen $650-class performance for $500-class pricing. Can you consider the $500 GPUs high-end when there is something faster AND more expensive down the line? Maybe so, but at what cost?
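As a rough sketch of that value argument (every price and performance ratio here is an illustrative placeholder, not benchmark data):

```python
# Illustrative sketch of the pricing shift described above.
# Performance is normalized; none of these numbers are measurements.
old_flagship = {"price": 650, "perf": 1.00}   # last-gen $650 card
new_midrange = {"price": 500, "perf": 1.00}   # new $500 card matching it
new_flagship = {"price": 650, "perf": 1.35}   # "only 35% faster" successor

def perf_per_dollar(card):
    """Normalized performance per dollar spent."""
    return card["perf"] / card["price"]

# The new $500 card looks like better value than the old flagship it matches...
assert perf_per_dollar(new_midrange) > perf_per_dollar(old_flagship)
# ...but the generational gain at the old $650 price point is only 35%.
generational_gain = new_flagship["perf"] / old_flagship["perf"] - 1
```

The $500 card's apparent value is exactly the "last-gen performance at a discount" framing the post describes; the generational gain at the flagship price point stays at 35%.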

At this point I'm hoping AMD starts bumping up its die sizes, as I want to see large-die GPUs duking it out, in the hope that large-die pricing drops down to the $550 price point we would all love to see.
 

rgallant

Golden Member
Apr 14, 2007
Just saying: the Titan might have been twice as fast as a GTX 580, but it was the same $ per fps, at $1K vs $500.00, by my math.
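That dollars-per-fps arithmetic checks out under the post's own figures; the baseline framerate below is hypothetical, and only the prices and the roughly-2x performance ratio come from the thread:

```python
# Dollars-per-fps comparison using the post's numbers.
# The baseline fps figure is hypothetical; the ratio cancels it out anyway.
price_580, price_titan = 500.0, 1000.0
fps_580 = 40.0              # assumed baseline framerate for the GTX 580
fps_titan = 2 * fps_580     # Titan at roughly twice the 580's performance

dollars_per_fps_580 = price_580 / fps_580        # 12.5 $/fps
dollars_per_fps_titan = price_titan / fps_titan  # 12.5 $/fps: no value gain
```

Double the speed at double the price leaves cost per frame unchanged, whatever baseline framerate is assumed.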
 

DeathReborn

Platinum Member
Oct 11, 2005
The 8800 GTX also gave us DX10 from Nvidia alongside the big performance bump. The GTX 480 gave us DX11 from Nvidia alongside a performance bump. Maxwell 2.0 gives us full DX12 support (full, or just the 11.2 feature level?) alongside what rumours say is a performance bump.

If the 870/880 drops before November with at least 780 performance for less than £350, I will be both surprised and opening my wallet at the same time.
 

rgallant

Golden Member
Apr 14, 2007
This is absolutely true in that if something new gets released and it is indeed faster than the previous generation, we have a new high end. What's changed, however, is that the high end which replaces the old high end will now be only 35% faster than what came before, as opposed to the 50%+ improvements we've seen in the past. The issue is that a 35% improvement for the same price we've been used to paying isn't exactly good value for our wallets.

At the rate we're going, assuming AMD doesn't release big dies anymore, we're bound to see $500 GPUs which outperform last-gen $650 GPUs, followed by the release of new $650 GPUs which outperform said $500 GPUs. I don't mean to sound like a broken record here, but all of this means the price points we're used to seeing will be bumped up a solid $150. Sooner or later we're all going to get used to this new pricing scheme and start believing the $500-class GPUs are worth it because they offer last-gen $650-class performance for $500-class pricing. Can you consider the $500 GPUs high-end when there is something faster AND more expensive down the line? Maybe so, but at what cost?

At this point I'm hoping AMD starts bumping up its die sizes, as I want to see large-die GPUs duking it out, in the hope that large-die pricing drops down to the $550 price point we would all love to see.

Also, besides price, NV has crippled all their cards with low VRAM. Being involved with game devs, they will know what's coming in the next two years, so an $850.00 high-end 780 Ti might only last a year. I hope I'm wrong, but if I need new cards next year they won't be NV, as it would be the fourth set I've upgraded for lack of VRAM. Add to that the fact that they held back the 6GB 780s, with no 6GB 780 Ti at all, which said it all about their products, yet the cheap GTX 770 got 4GB on day one.
 

Grooveriding

Diamond Member
Dec 25, 2008
The 680 was only mid-range in spec, not in market position or product line-up, during its 14-month run as Nvidia's 'flagship' (the 680 came in 3/12 vs the 780 in 5/13), about the time it has traditionally taken GPU makers to come out with a new product line. How can a product that virtually equals the competition's top card be mid-range without a flagship above it to position it as such? o_O

I wonder how many people would wait over a year for a top card to arrive after a mid-range had been released? You had a 680, didn't you? :D

Sure, I had two of them. I went from three GTX 480s to two GTX 680s. In many cases the 680s were slower. Had I gone from three 480s to two 780 Ti cards, the difference would have been vast in favour of 780 Ti SLI.

We all know the 680 was touted as a flagship in marketing, but spec is all that matters. It was a mid-range card sold as high end. This is why I don't think it's worth it any more to buy into the new trend of selling mid-range cards as high end to buffer between the real high-end GPUs.

The 880 will be a nice upgrade for a 680/670/770 user, but for someone on a 780, Titan, 780 Ti or 290(X), it's probably going to be a 20% improvement. Just my take on it, having experienced the 480-to-680 jump and the performance disappointment that was. Normally I'll get whatever is fastest as it comes out, but it's not feeling worth it as much any more. Games are pretty stagnant, and 780 Ti SLI rips up 2560x1600.

Also, considering the move to 4K, which is going to get more prevalent soon as 4K monitors arrive and drop in price: if these 880s are mid-range chips in spec, there is no way they will be able to handle 4K much better than the current best cards can. Historically, mid-range cards are really weak at high resolutions, with their lower shader counts and weak bandwidth compared to the big cards.
 

BrightCandle

Diamond Member
Mar 15, 2007
When the 680 was released we knew Nvidia had the GK110 chip in the works; the whole forum was talking about how Nvidia rebranded its mid-range gamer chip to compete favourably with the 7970. AMD's strategy at that point was still to compete with Nvidia's high end using a dual-chip card built from its moderately sized GPUs; the problem is that AMD released the 7970 at a ridiculous price considering what the card was. But I think everyone that bought it or the 680, just comparing die sizes, transistor counts and the basic specs, could see these were mid-range-targeted cards.

What I did not expect was GK110 coming out a full year later as a refresh. That was unexpected; I expected those cards would have been out sooner. AMD's very late refresh with the 290 was also kind of surprising: it came nearly 20 months after the first card, at what appears to be very close to the 20nm process being ready.

This whole generation has been a mess, and in the end it was very bad for customers in general; almost all of us got screwed for one reason or another. I seriously hope we don't see this again, but it looks like it might be a successful tactic for Nvidia and AMD: they sold more cards this way, so they might very well hold back their big chips a year again to make the refresh better.
 

Olivier Duff

Junior Member
Jun 19, 2014
Should I see a significant difference with this compared to my current Radeon HD 7900? I have a 2560x1440 monitor, so I could probably use the boost in framerates.

I feel like I want to go back into Nvidia territory.

Or is there a better value out there for what I'm aiming for?
 

Bubbleawsome

Diamond Member
Apr 14, 2013
I feel like AMD has a fighting chance this time. No one was expecting the 290X, and "certain people" have said the new 390X will be a bigger die than we've seen from AMD yet. If they wait for the GM104 launch and then beat the snot out of it with a large die that rivals GM100, that would really boost their customer base.
 

tviceman

Diamond Member
Mar 25, 2008
I feel like AMD has a fighting chance this time. No one was expecting the 290X, and "certain people" have said the new 390X will be a bigger die than we've seen from AMD yet. If they wait for the GM104 launch and then beat the snot out of it with a large die that rivals GM100, that would really boost their customer base.

Except the design of the 390X will already be nailed down by the time GM104 launches, so its design and performance will have nothing to do with GM104's launch or performance. And if it isn't out of the design stage by the time GM104 launches, it'll be at least 9-12 months from release, making its performance vs. GM104 entirely moot.
 

Bubbleawsome

Diamond Member
Apr 14, 2013
Except the design of the 390X will already be nailed down by the time GM104 launches, so its design and performance will have nothing to do with GM104's launch or performance. And if it isn't out of the design stage by the time GM104 launches, it'll be at least 9-12 months from release, making its performance vs. GM104 entirely moot.

I mean have the 390X ready and let the 880 go first, then get destroyed a month later. If the 390X launched first, Nvidia would have time to tweak the 880, pull a Titan Z and retreat from the spotlight, or release GM100 as an 880 Ti instead of a 980 or 980 Ti, a la the 680, where the 780 Ti should have been the 680 Ti. (This is all assuming the AMD big die > the Nvidia big die.)
 

tviceman

Diamond Member
Mar 25, 2008
I mean have the 390X ready and let the 880 go first, then get destroyed a month later. If the 390X launched first, Nvidia would have time to tweak the 880, pull a Titan Z and retreat from the spotlight, or release GM100 as an 880 Ti instead of a 980 or 980 Ti, a la the 680, where the 780 Ti should have been the 680 Ti. (This is all assuming the AMD big die > the Nvidia big die.)

Given that Hawaii only came out in November, and given its power characteristics, there is some very major reworking to be done for AMD to release a bigger, faster chip on 28nm. Nvidia already has that foundation completed with GM107, but look how long it's taking them to follow it up.

Releasing 400+mm² dies takes a ton of work; they aren't designed from start to finish over the course of six months, especially if the current design runs incredibly hot and is very, very power hungry.