Possible GTX 880 benched in 3DMark Firestrike Extreme!


Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
All that matters with 3DMark when comparing cards is the GPU score; the overall score is pointless. This is definitely a fake. If a 1300MHz 780 Ti hits 7000 GPU, a 6500 GPU score is very poor for a new flagship. There has never been a case of a flagship release whose performance can be matched by overclocking the previous part, unless you count refreshes like 480 to 580 or 780 to 780 Ti. But not actual new flagship architecture releases.

The 880 is not going to be anything phenomenal because of 28nm and a smaller die than GK110, but I would expect more. Then again, this image does put it about 20% faster than a 780 Ti with no adjustments... soooo maybe? It is a unique situation with Nvidia stuck on 28nm.
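For what it's worth, here is the simple uplift arithmetic behind those score comparisons, as a minimal sketch; the ~7000 (overclocked 780 Ti) and ~6500 (alleged 880) GPU scores are the ones discussed above, while the stock 780 Ti score is an assumed round figure, not something from the leaked screenshot.

```python
# Percent-uplift math behind the Fire Strike Extreme GPU-score comparison.
# Only the 7000 (OC'd 780 Ti) and 6500 (alleged GTX 880) scores come from the
# thread; the stock 780 Ti score is an assumed round number for illustration.

def uplift(new: float, old: float) -> float:
    """Percent difference of `new` relative to `old`."""
    return (new / old - 1) * 100

stock_780ti = 5400   # assumed stock GPU score
oc_780ti    = 7000   # cited score for a ~1300MHz overclocked 780 Ti
alleged_880 = 6500   # GPU score attributed to the leaked card

print(f"880 vs stock 780 Ti: {uplift(alleged_880, stock_780ti):+.0f}%")  # ~+20%
print(f"880 vs OC 780 Ti:    {uplift(alleged_880, oc_780ti):+.0f}%")     # ~-7%
```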

Pretty pathetic if sub 20% is what is delivered. If it's cheap enough it will be great, but I expect $700+

GPUs really are going the way of CPUs with meager improvements starting to become the norm.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
780 ti +30% for $500? Considering nVidia's typical pricing, that will be a real breath of fresh air. I'll be really shocked though as it will be a major reduction in nVidia's ASP. Add to that there will be a lot of GK110 cards that will turn into unsalable stock if they do that.

Well, since it is a successor to GK104, if anything it should cost $500 for 4GB and $550 for 8GB. If NV raises prices above that, then it's yet another price increase. A real next-generation flagship from NV is 50-100% faster, so why would this card cost $650-700? If this were the GM200 chip, then $700 would be reasonable.

I just don't know.

How much faster was the stock 290X over the 7970 GHz?

Is it reasonable to think that Nvidia can get +30% over a 780 Ti while using less power?

That was with a node shrink. The 28nm shrink played a huge role. It was significant, yet people seem to forget this lately.

I just don't know. I know that >30% over a 780 Ti on the same node could happen for sure. But all while reducing power at the same time? I just don't know about that.

You have to think about it. The GK104 accidentally became the GTX 680. It was originally aimed at being the 770 Ti, but the chip came back from the fab better than expected. It was originally designed for much more conservative clocks, but they quickly found it was much more capable. After AMD launched the HD 7970, Nvidia pushed the GK104 up to notch out a win over it. But it was purely luck. Nvidia engineers were blessed with the GK104 on the 28nm node.

The 290X was about 30-33% faster than the 7970 GHz.
http://www.computerbase.de/2014-02/nvidia-geforce-gtx-750-ti-maxwell-test/5/

The 680 was no luck - that was solid NV engineering. Why can't GM204 beat the 780 Ti by 30% at 185-200W TDP? Maxwell gets 2x the performance/watt. Based on leaks, the chip is 400-430mm2, not 294mm2 like the Kepler 680 was. We already know from the 750 Ti that a 70-100% increase in performance/watt is proven for this architecture, so beating the 780 Ti at 185-200W TDP should be an easy target for this part. If leaked scores showed a 50%+ increase, then doubts would be justifiable due to a narrow bus and pixel fillrate bottleneck; or if this chip were sub-300mm2, it would also be less believable. But neither is the case based on the leaks.
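A quick back-of-the-envelope sketch of that performance/watt argument; the 250W figure is the 780 Ti's reference TDP, and the GM204 TDP range and +30% target are the rumoured numbers discussed here.

```python
# Does "780 Ti + 30% at 185-200W" fit within the 70-100% performance/watt
# improvement Maxwell already demonstrated with the 750 Ti?
# 250W is the GTX 780 Ti reference TDP; the rest are the rumoured GM204 figures.

ref_tdp_780ti = 250.0           # W, GTX 780 Ti reference TDP
gm204_tdps    = (185.0, 200.0)  # W, rumoured TDP range for GM204
perf_gain     = 1.30            # target: 30% faster than the 780 Ti

for tdp in gm204_tdps:
    perf_per_watt = perf_gain * (ref_tdp_780ti / tdp)
    print(f"At {tdp:.0f}W: implied perf/watt gain = {perf_per_watt:.2f}x")
# ~1.76x at 185W and ~1.63x at 200W, i.e. broadly in line with the 70-100%
# perf/watt gain the 750 Ti showed over first-generation Kepler.
```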
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Driver support for these Quadro cards was just added to a Nvidia driver:

NVIDIA_DEV.0FF3 = "NVIDIA Quadro K420"
NVIDIA_DEV.103C = "NVIDIA Quadro K5200"
NVIDIA_DEV.11B4 = "NVIDIA Quadro K4200"
NVIDIA_DEV.13BA = "NVIDIA Quadro K2200"
NVIDIA_DEV.13BB = "NVIDIA Quadro K620"
http://forums.laptopvideo2go.com/topic/31030-inf-v5009/

Quadro K2200M is confirmed GM107 Maxwell.

Good chance that the K4200 and K5200 will be GM204. They might be revealed at SIGGRAPH in mid-August.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Yup, surely these are GK104; they are not gonna replace the K6000 so soon, which would piss off a lot of customers, including ours.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I could be very wrong here, but since the K2200M is Maxwell, I'm assuming the rest of the Kx2xx line is too.

Why would they push out more Quadro cards based on Kepler? They have like a gazillion out there on mobile and desktop.
And wasn't it Quadro cards that first came out with Kepler, before consumer Kepler GTX cards hit the market?

I could see them releasing a Quadro Kepler card to take on the FirePro W9100 (and beat the K6000), but because of the K5200 name I guess that's not gonna happen. What audience would the K5200 be targeted toward, and what card are they looking to replace or take on?
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
I am sorry man, I meant GM104, damn that K :D Yup, the first card based on GK104 was the K5000, which didn't replace the Quadro 6000, the top-of-the-line Quadro at the time. It took GK110 to take the crown away from the 6000, so I think I am seeing a pattern here: the first one based on a new architecture will be very good on performance/price ratio, and the next one will be the real successor.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I am sorry man, I meant GM104, damn that K :D Yup, the first card based on GK104 was the K5000, which didn't replace the Quadro 6000, the top-of-the-line Quadro at the time. It took GK110 to take the crown away from the 6000, so I think I am seeing a pattern here: the first one based on a new architecture will be very good on performance/price ratio, and the next one will be the real successor.

There is no GM104. There is GM204, and GM200. GM107 is the only GM1** produced, and that went into the 750 Ti.
 

Pinstripe

Member
Jun 17, 2014
197
12
81
Where is the GM206, and was there ever one? Or is Nvidia just going to rebrand Kepler for the GTX 850-860 once more?
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
There is no GM104. There is GM204, and GM200. GM107 is the only GM1** produced, and that went into the 750 Ti.

When did they change it? I thought it was like GF104 -> GK104 -> GM104... anyways, thanks, I stand corrected.
 

Mand

Senior member
Jan 13, 2014
664
0
0
When did they change it? I thought it was like GF104 -> GK104 -> GM104... anyways, thanks, I stand corrected.

When GM107 came out last winter, with no other products alongside it. It was very clearly a one-off to test their new architecture on the existing process. The actual full new generation is the 800 series, which they decided to use GM2** for.

Why they decided to do that, I have no idea, but it's what they did.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
GM2xx represents 2nd-generation Maxwell. Calling it GM1xx would signal 1st generation, and they clearly want to separate 1st and 2nd gen for some reason when you look at the marketing slides.

So if the K2200M is GM107, so will the K2200 be.
Making the K4200 and K5200 with GM107 makes no sense since the K2200 exists. And since only GM204 and GM200 exist, it has got to be GM204.

Kepler-based this late makes zero sense. And what better way to start 2nd-gen Maxwell than with Quadro, just like Kepler did?

There is a graphics conference coming soon called SIGGRAPH, where Nvidia first announced the Quadro K6000. So a GM204 Quadro around August 13th, then a GM204 GTX 880 a day or so after at Gamescom, like the leaker on the first page talked about.

Fingers crossed
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Where is the GM206, and was there ever one? Or is Nvidia just going to rebrand Kepler for the GTX 850-860 once more?

It might come sometime next year, but there has been talk that Nvidia is making 4 different GM204 SKUs (GPUs) to make up for the lack of midrange chips like GM206.

Seems plausible...
Just speculation from my side, but the countdown from Nvidia UK that ends on August 13th/14th, and the fact that it's still counting down even though Nvidia UK just wrote about the Nvidia Shield tablet today, makes it very plausible that something is gonna go down that month involving an unreleased GTX card.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
A wise man once told me, "disappointment is a function of expectation". :)

That's a great saying, but I'm not sure if it applies here.

Because see, here people were expecting 15-25% for an 880 Ti, but then are suddenly disappointed with the 880's 28%.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Hmmmmm, I don't know what to believe anymore.

Apparently the Quadro K5200 is a GK110. The K5000 is a GK104, so I don't even know why they are both in the 5000 series.

The Quadro K4200 is GK104, put alongside a GK106 K4000.

That is one thing, but why the hell would they make another series of Quadro cards based on Kepler? It's been almost 2.5 years since the first Kepler Quadros came out, FFS.
Unless the entries linked to above aren't 100% confirmed. What a crazy world.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That's a great saying, but I'm not sure if it applies here.

Because see, here people were expecting 15-25% for an 880 Ti, but then are suddenly disappointed with the 880's 28%.

Serious question, not trolling: Do a lot of people just upgrade for the sake of upgrading/to have a newer/larger number in their PC? :p

A 25-28% faster 880 Ti does nothing to fix the situations where we need more GPU power, while at 1080P a 780 Ti already slices through pretty much everything. That's why I just can't get excited even if the 880 Ti is 28% faster than the 780 Ti for $550.

For power users on 1440/1600P, 15-30% is still not good enough. For a real game changer beyond 290X/780Ti, we need 50-100% faster to make the gaming experience significantly better.

[Benchmark charts at 2560x1600: Assassin's Creed IV, Crysis 3, Far Cry 3, Tomb Raider]


I realize it'll be different for someone coming from 470/480/570/580 though.
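To put those percentages into frame-rate terms, a purely illustrative sketch; the 35 fps baseline is an assumed single-GPU figure for a demanding title at 2560x1600, not a number taken from the charts above.

```python
# Illustrative only: what a given uplift means for high-resolution frame rates.
# The 35 fps baseline is an assumption, not a figure from the benchmark charts.

baseline_fps = 35.0
for uplift in (0.15, 0.28, 0.50, 1.00):
    print(f"+{uplift:.0%}: {baseline_fps * (1 + uplift):.0f} fps")
# +15%: 40 fps, +28%: 45 fps, +50%: 52 fps, +100%: 70 fps
# Only toward the 100% end of that range does a 35 fps title clear 60 fps.
```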
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Serious question, not trolling: Do a lot of people just upgrade for the sake of upgrading/to have a newer/larger number in their PC? :p

A 25-28% faster 880 Ti does nothing to fix the situations where we need more GPU power, while at 1080P a 780 Ti already slices through pretty much everything. That's why I just can't get excited even if the 880 Ti is 28% faster than the 780 Ti for $550.

There's no mention of a Ti in the OP. It's just a regular 880. With the usual caveat: if we are to believe it.

I am excited about Maxwell, simply because of what we've seen from the 750 Ti.
Which imho is the best little card since the HD4770. No, since forever.
And I am excited about pure architectural gains on the same node.
GPUs are said to be mature tech, and yet we haven't seen these kinds of efficiency gains in a very long time.

And we have to repeat this over and over - efficiency means performance.

Games are a whole other issue.
I happily play turn-based 2D :biggrin:, or I'll do 4xSSAA on a several-years-old shooter and still run out of gas on my 290.
Too much performance is never too much :ninja:

28% from the little big die is on the high side of my expectations.
See, I voted here 45-50% for the 880 Ti, because I don't think NV would make an x80 Ti from an x04 part.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
There's no mention of a Ti in the OP. It's just a regular 880. With the usual caveat: if we are to believe it.

28% from the little big die is on the high side of my expectations.
See, I voted here 45-50% for the 880 Ti, because I don't think NV would make an x80 Ti from an x04 part.

It would be pretty epic if the bench in the OP is from a GTX 880 not using a full GM204, while a GTX 880 Ti has a few more cores and more power :cool:
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
Serious question, not trolling: Do a lot of people just upgrade for the sake of upgrading/to have a newer/larger number in their PC? :p

A 25-28% faster 880 Ti does nothing to fix the situations where we need more GPU power, while at 1080P a 780 Ti already slices through pretty much everything. That's why I just can't get excited even if the 880 Ti is 28% faster than the 780 Ti for $550.

For power users on 1440/1600P, 15-30% is still not good enough. For a real game changer beyond 290X/780Ti, we need 50-100% faster to make the gaming experience significantly better.


I realize it'll be different for someone coming from 470/480/570/580 though.

Going on the rumours of 28nm again and a die size of 400mm2, having the common sense not to expect much is just being reasonable. Those two rumours together say to expect disappointment in the performance-improvement category. GPUs are going the way of CPUs, with unremarkable performance gains every generation, unlike the huge jumps we used to get in the past. Traditionally there never would have been a 680; we'd have gone from the 580 right to the 780, with a refresh bringing a 780 Ti. The jump from 780 Ti to 880 looks to be even worse than 580 to 680.

The most unfortunate part is that this is going to become the norm now. Nvidia and AMD are at the mercy of TSMC and what they can afford to produce while remaining profitable. While Intel pushes forward to the most cutting-edge fab tech it is capable of producing, GPUs are dropping behind and not taking advantage of the newest process tech available. So we no longer reap the big rewards a node shrink brings.

This is not even accounting for the need to take a random photoshopped image off the web as gospel. That is something I don't even relate to or understand :ninja:

The performance gain is too small. If this image were real, the improvement would be worse than going from the 580 to the 680. While that's expected given 28nm and a smaller die, it's still a disappointment for someone who would actually buy the card wanting an upgrade, rather than someone wanting to find the positives in it regardless of everything...
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
It would be pretty epic if the bench in the OP is from a GTX 880 not using a full GM204, while a GTX 880 Ti has a few more cores and more power :cool:

A 256-bit card beating 384-bit and 512-bit GPUs. How's that for exciting?
It does not matter if it crushes everything in benchmarks - I can already see people screaming bloody murder over the 256-bit MC.

Besides, perf gains on the same node are way sexier than those from a node shrink :oops:
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
No shrink this time though. I'm interested to see how much performance we get (and at what price, of course).

Well, if GM204 is at ~400mm^2 then im expecting to be like the GK104 over GK110. That is, the GM204 will sacrifice GPGPU performance for Gaming and smaller die size when GM200 will be the high end SKU.