[Various] NVIDIA GeForce GTX 1080 Review Thread


railven

Diamond Member
Mar 25, 2010
Thanks for the link. I'm going to subscribe to this YouTube channel; this is a very good piece of unique content. Props to them for testing this out.

Yerp, agreed. I was speculating that the EVGA Hybrid would fit, and it's good to see that, not long after the reviews went up, someone had the same idea and the means to test it.

That video saved me from jumping the gun and buying a reference PCB + EVGA Hybrid.
 

antihelten

Golden Member
Feb 2, 2012
So:
12% best-case scenario for the 980 Ti
24% the usual speed-up
That's how it goes, in my opinion, when it comes to overclocked performance. I'd like to hear your take on it.

Pretty much agree with this.

So we're looking at a 30-35% improvement over the 980 Ti at stock, and a 25% improvement once overclocking is included. Hopefully this will go back up to 30-35% once we get some decent aftermarket cards.
 

moonbogg

Lifer
Jan 8, 2011
Pretty much agree with this.

So we're looking at a 30-35% improvement over the 980 Ti at stock, and a 25% improvement once overclocking is included. Hopefully this will go back up to 30-35% once we get some decent aftermarket cards.

Remember, that's day-one performance at release. There really will be driver destruction brought down on Maxwell's performance in the coming months. I'm not saying that to trash Nvidia; I'm saying it because I find it highly likely to actually happen. A year from now I expect that 25-30% lead to grow to 50-60% or more. They don't want people choosing cheap used 980 Tis over their new cards, so they will just wreck them.
However, since they did that to Kepler and everyone caught on, there will be many eyes watching what happens to Maxwell performance. Any unexplained anomalies will be brought right into the spotlight and examined under a microscope. If they do it again, you can expect them to get absolutely trashed for it... OR... people won't care and will just buy Pascal, only to have it happen again when Volta comes out.
 

antihelten

Golden Member
Feb 2, 2012
Remember, that's day-one performance at release. There really will be driver destruction brought down on Maxwell's performance in the coming months. I'm not saying that to trash Nvidia; I'm saying it because I find it highly likely to actually happen. A year from now I expect that 25-30% lead to grow to 50-60% or more. They don't want people choosing cheap used 980 Tis over their new cards, so they will just wreck them.

I don't think Nvidia will intentionally "wreck" Maxwell as such, but they will almost certainly give it less attention, which, given that we are on the verge of a new API (DX12) taking over, plus VR, is potentially just as bad.
 

flopper

Senior member
Dec 16, 2005
Remember, that's day-one performance at release. There really will be driver destruction brought down on Maxwell's performance in the coming months. I'm not saying that to trash Nvidia; I'm saying it because I find it highly likely to actually happen. A year from now I expect that 25-30% lead to grow to 50-60% or more. They don't want people choosing cheap used 980 Tis over their new cards, so they will just wreck them.

Nah, enjoy the old 980 Ti while you can :)
 

x3sphere

Senior member
Jul 22, 2009
Remember, that's day-one performance at release. There really will be driver destruction brought down on Maxwell's performance in the coming months. I'm not saying that to trash Nvidia; I'm saying it because I find it highly likely to actually happen. A year from now I expect that 25-30% lead to grow to 50-60% or more. They don't want people choosing cheap used 980 Tis over their new cards, so they will just wreck them.
However, since they did that to Kepler and everyone caught on, there will be many eyes watching what happens to Maxwell performance. Any unexplained anomalies will be brought right into the spotlight and examined under a microscope. If they do it again, you can expect them to get absolutely trashed for it... OR... people won't care and will just buy Pascal, only to have it happen again when Volta comes out.

I expect it to happen as well. However, I'm not going to reward Nvidia by switching to a Pascal card right now. If the same thing happens to Maxwell, I won't be giving Big Pascal any consideration and will probably go with Vega.
 
Mar 10, 2006
I expect it to happen as well. However, I'm not going to reward Nvidia by switching to a Pascal card right now. If the same thing happens to Maxwell, I won't be giving Big Pascal any consideration and will probably go with Vega.

The Maxwell architecture is very similar to the Pascal architecture; they can't "gimp" Maxwell without gimping Pascal, AFAICT.
 

tential

Diamond Member
May 13, 2008
The Maxwell architecture is very similar to the Pascal architecture; they can't "gimp" Maxwell without gimping Pascal, AFAICT.
Well, maybe as far as you are concerned.
It's not about gimping Maxwell; it's about finding things Pascal excels at relative to Maxwell and using them to create features that promote Pascal.

This is Nvidia, not a bunch of idiots running a company. They are more than capable of creating a new GameWorks feature that caters to Pascal.
 
Mar 10, 2006
Well, maybe as far as you are concerned.
It's not about gimping Maxwell; it's about finding things Pascal excels at relative to Maxwell and using them to create features that promote Pascal.

This is Nvidia, not a bunch of idiots running a company. They are more than capable of creating a new GameWorks feature that caters to Pascal.

Eh, I don't think that GameWorks was ever designed to do the sinister things that many people claim it does.

Maxwell is a very different architecture from Kepler, the SMs are vastly different, the ratio of compute units to texture and geometry units is way different, etc. This means that code that's optimized to run really well on something like Maxwell may run into problems on something like Kepler.

Anyway, I know it's fun for the Forum Conspiracy Theorists to think NVIDIA is screwing its customers for an extra buck today at the expense of potentially terrible sentiment toward the company and bad press, but I seriously doubt that the engineering teams are doing what some people claim they're doing.

I'll make the claim now: I do not believe Maxwell will age materially worse relative to Pascal. If I'm wrong, I'm sure this'll wind up in a bunch of signatures as a (hopefully friendly) "you were wrong."
 

sirmo

Golden Member
Oct 10, 2011
You guys are being naive if you think GameWorks and Gaming Evolved aren't about controlling how the products are perceived.

A black-box approach has never been beneficial to consumers.
 

know of fence

Senior member
May 28, 2009
It's great that Nvidia did the work to enable, basically, multi-window projection from different perspectives into the virtual world, because triple displays weren't really viable before. And hopefully multi-monitor setups can also idle without consuming hundreds of watts...

It's curious that a gaming PC built today basically relies only on the benefits of high-frequency RAM and high default core clocks. When the original Titan was released in early 2013, I remember the review gushing about the first 7-billion-transistor IC that you could purchase for a "cool grand". That number hasn't changed much from Kepler to Maxwell (8 billion in the 980 Ti) to Pascal (7.2 billion). Meanwhile, the clocks pretty much doubled, from 837 MHz on the original Titan to the 1607 MHz base clock of the 1080.
The only surprisingly low number is the 180 W TDP, which, scaled up to the usual 250 W cooling-capacity limit, equates to exactly a cool 10 billion transistors for the "big Pascal" future.
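A quick back-of-the-envelope in Python, for anyone who wants to check that scaling (this assumes the transistor budget scales linearly with TDP at fixed process and clocks, which is of course a big simplification):

    # GTX 1080 (GP104) figures, scaled to an assumed 250 W cooling ceiling
    gp104_transistors = 7.2e9  # transistors
    gp104_tdp = 180.0          # watts
    big_pascal_tdp = 250.0     # watts, assumed ceiling for "big Pascal"

    scaled = gp104_transistors * big_pascal_tdp / gp104_tdp
    print(f"{scaled / 1e9:.1f} billion transistors")  # -> 10.0 billion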
 

railven

Diamond Member
Mar 25, 2010
Got the flier from Microcenter, GTX 1080 available in stores May 27th.

So tempted!!!! OMG!!! Maybe I'll do a weasel: buy it, then return it before the return period is up. D:
 

know of fence

Senior member
May 28, 2009
We already know big Pascal GP100 has 15.2 billion transistors

OK, thanks for the correction. I looked it up. Apparently also a TDP of 300 W, 3,584 shader cores (1.4x the core count), a 610 mm² die (up from 314 mm² for the GTX 1080), and slightly lower clocks: 1328 MHz base / 1480 MHz boost.

https://images.nvidia.com/content/pdf/tesla/whitepaper/pascal-architecture-whitepaper.pdf

Well, it did take a year after the first 28 nm products launched for the original Titan to be released, and an additional 9 months for the Titan to materialize in the consumer space as the 780 Ti at a somewhat reasonable MSRP.
Also, the GTX 600 series on 28 nm topped out at 3.5 billion transistors (GTX 680), which is very much in line with history repeating itself and the numbers doubling according to Moore's Law (or rather, Moore's Slow).
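For anyone who wants to sanity-check those ratios, here's the arithmetic in Python using only the figures cited above (GP100 from the whitepaper, GP104 from the GTX 1080 reviews; the 2560-core count is the published 1080 spec):

    # GP100 vs. GP104, using the numbers quoted in this thread
    gp104 = {"cores": 2560, "die_mm2": 314, "transistors": 7.2e9}
    gp100 = {"cores": 3584, "die_mm2": 610, "transistors": 15.2e9}

    print(gp100["cores"] / gp104["cores"])              # 1.4x shader cores
    print(gp100["die_mm2"] / gp104["die_mm2"])          # ~1.94x die area
    print(gp100["transistors"] / gp104["transistors"])  # ~2.11x transistors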
 

crisium

Platinum Member
Aug 19, 2001
GP100 will never make it into a consumer product. Clock the 1080 higher and you get about the same SP FLOPS in half the die area.

Why can't a consumer GP100 clock higher too? GM200 and GM204 reach very similar max boost overclocks.

I see no reason not to expect a 3840-SP Titan neXt and likely a 3584-SP 1080 Ti (perhaps fewer if they want to widen the gap, just like the 1070 does). Like Maxwell, they will be clocked a bit lower at stock but should reach similar max OC speeds. HBM2 will also offer the big Pascal chip a greater bandwidth advantage over GDDR5X than 384-bit GDDR5 gave Maxwell over 256-bit.

I don't see any reason GP100 cannot occupy roughly the same relative performance tier GM200 has.

Unless there's evidence of the professional GP100 having a MHz limit. I'm not really following it, TBH.
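The SP FLOPS argument is easy to check, since peak single-precision throughput is just cores x clock x 2 (an FMA counts as two ops). The 2.0 GHz overclock below is my assumption based on early 1080 OC reports, not a spec:

    # Peak SP throughput in TFLOPS = cores * clock (GHz) * 2 / 1000
    def sp_tflops(cores, clock_ghz):
        return cores * clock_ghz * 2 / 1000

    print(sp_tflops(3584, 1.480))  # P100 at boost: ~10.6 TFLOPS on 610 mm2
    print(sp_tflops(2560, 1.733))  # GTX 1080 at boost: ~8.9 TFLOPS on 314 mm2
    print(sp_tflops(2560, 2.000))  # GTX 1080 OC (assumed): ~10.2 TFLOPS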
 
Mar 10, 2006
Why can't a consumer GP100 clock higher too? GM200 and GM204 reach very similar max boost overclocks.

I see no reason not to expect a 3840-SP Titan neXt and likely a 3584-SP 1080 Ti (perhaps fewer if they want to widen the gap, just like the 1070 does). Like Maxwell, they will be clocked a bit lower at stock but should reach similar max OC speeds. HBM2 will also offer the big Pascal chip a greater bandwidth advantage over GDDR5X than 384-bit GDDR5 gave Maxwell over 256-bit.

I don't see any reason GP100 cannot occupy roughly the same relative performance tier GM200 has.

Unless there's evidence of Professional GP100 having a MHz limit. I'm not really following it TBH.

P100 is rated at a monstrous 300 W TDP, and a lot of the silicon die is spent on double-precision units. For gaming, not only is this probably not going to give you a big boost over the 1080, but from a cost perspective it'd be obscene.

No, "Big Pascal" for gamers will look different than GP100. Expect a smaller die than P100 with all of the DP crap stripped out.
 

crisium

Platinum Member
Aug 19, 2001
Oh, I got ya now. I thought you meant there would be no Big Pascal at all for consumers. You're right, they should have something more efficient for gaming than GP100.
 

CakeMonster

Golden Member
Nov 22, 2012
Yep, they can still make it smaller (cheaper to produce) but still increase the cores and memory bandwidth. I expect NV to easily hit another 30% increment and charge us all over again. Typically the big chip is more expensive to customers too. They don't even need HBM2; 384-bit GDDR5X will probably be used if it turns out to be cheaper.
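The bandwidth math is simple enough to play with: peak bandwidth is the bus width in bytes times the per-pin data rate. The 384-bit GDDR5X configuration below is hypothetical; 10 Gbps is the launch GDDR5X rate and 7 Gbps is the 980 Ti's GDDR5 rate:

    # Peak bandwidth (GB/s) = (bus bits / 8) * data rate (Gbps per pin)
    def bandwidth_gbs(bus_bits, gbps_per_pin):
        return bus_bits / 8 * gbps_per_pin

    print(bandwidth_gbs(256, 10))  # GTX 1080, 256-bit GDDR5X: 320 GB/s
    print(bandwidth_gbs(384, 7))   # 980 Ti, 384-bit GDDR5: 336 GB/s
    print(bandwidth_gbs(384, 10))  # hypothetical 384-bit GDDR5X: 480 GB/s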