
How good will Pascal be?

This is all absolutely seat-of-the-pants, but based on the demo RTG showed of Polaris playing Battlefront at the same settings and framerate as a 950 while consuming 86W at the wall vs. 140W, it's likely P10 is at least twice as efficient as a 950 in certain situations. A stock 280X is about 30% faster than a stock 950, and a stock 950 pulls about 70% more power than a 750Ti.

I don't see a strong reason at this point why nVidia would have worse perf/watt than AMD in the next gen, so I wouldn't see it being a big stretch to get around 280X performance within a 75W envelope, especially in 2017 games.

What I'm really curious about is the CPU. With all the fixed consumption of the other components inside that 86W, there's plenty of room for the GPU to be at least twice as efficient, and it could be a good bit more depending on the CPU. Personally I'm betting on something like a mobile chip.
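A quick sketch of the arithmetic behind that claim. Purely illustrative: the 86W and 140W wall readings come from the demo, but the PSU efficiency and the non-GPU system draw are assumed numbers, not anything RTG published.

```python
# Back-of-envelope check of the Battlefront demo numbers.
# Only the two wall readings are from the demo; everything else is assumed.

WALL_POLARIS = 86.0    # W at the wall, Polaris system (from the demo)
WALL_950 = 140.0       # W at the wall, GTX 950 system (from the demo)
PSU_EFFICIENCY = 0.90  # assumed roughly Platinum-class efficiency at this load
REST_OF_SYSTEM = 45.0  # assumed DC draw of CPU/board/RAM/etc., in watts

def gpu_draw(wall_watts, rest=REST_OF_SYSTEM, eff=PSU_EFFICIENCY):
    """Estimate GPU-only DC draw from a wall-power reading."""
    return wall_watts * eff - rest

polaris = gpu_draw(WALL_POLARIS)  # ~32 W
gtx950 = gpu_draw(WALL_950)       # ~81 W

# Same framerate at the same settings, so perf/W scales as inverse power.
print(f"Implied GPU efficiency ratio: {gtx950 / polaris:.1f}x")  # → 2.5x
```

Under those assumptions the GPU-only efficiency gap comes out noticeably larger than the raw 140/86 wall ratio, which is the point being made about fixed consumption.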
 

The system specs were listed, it was a 4790k.
[Attached screenshot: 6bqF1Rc.png — the demo's listed system specs]
 

Interesting. It looks like they did drop the power limit. I wonder how low that put the wattage.

Also am I way off the mark or does that look like voltage that would normally be considered an undervolt?
 
This has been pointed out many times and is a common observation now, ever since the 600 series screwed everything to hell. I don't know why you are pretending this isn't the case. You seem to be interested in nitpicking my post and pointing out little flaws when none of that takes away from the main point that was being made.

Probably because you are just making stuff up. The 680 was the first 600 series card Nvidia released. No faster single GPU 600 series card was ever released. The original Titan was the first 700 series card Nvidia released. Nothing completely surpassed it (though the 780Ti got close) until the Titan Black was released a year later. Which means that the 900 series is the only series where Nvidia released a mid level card before the flagship. There has been no trend established here like you are trying to claim. And it also needs to be noted that the release date for the flagship Titan X was not pushed back to make that schedule possible. Nvidia moved up the 980 release to late 2014. So you're complaining that with the current series Nvidia accelerated the release schedule for the mid level cards. That's a rather peculiar complaint.
 

Not sure. Even with Platinum supplies it's hard to imagine that the rest of the system is drawing less than 40W, since that's what a 4790k system normally idles at. I'd actually be really surprised if the rest of the system drew less than 50W in that gaming load, which would make the Polaris number nigh unbelievable.
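To illustrate that point, here's how much the implied Polaris GPU draw swings with the rest-of-system estimate. The 86W wall reading is from the demo; the PSU efficiency and the candidate non-GPU draws are assumptions.

```python
# Sensitivity of the implied Polaris GPU draw to the rest-of-system estimate.
# Wall reading (86 W) is from the demo; the other inputs are assumptions.

WALL_POLARIS = 86.0    # W at the wall (from the demo)
PSU_EFFICIENCY = 0.90  # assumed roughly Platinum-class efficiency at this load

for rest in (40.0, 45.0, 50.0):  # candidate non-GPU DC draws, watts
    gpu = WALL_POLARIS * PSU_EFFICIENCY - rest
    print(f"rest-of-system {rest:.0f} W -> implied Polaris GPU draw {gpu:.1f} W")
```

With a 50W rest-of-system figure the GPU would be left under 30W for 950-class performance, which is why the number looks so hard to believe.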
 

The 680's GPU is a midrange GPU (GK104). GK110 is above it in the same family (Kepler) and GK106 + GK107 are below it. That is by definition "the middle." It has a die size in between GK110 and GK106, which is the middle. Flagship, by every definition, is the best/fastest product in a product family. If you define the product family as Kepler-architecture consumer discrete GPUs, then the flagship is based on GK110.

"Midrange first" started with the 680 for Nvidia. Where it begins with AMD is another discussion, but it's reasonable to say the 7970 was close enough to be effectively the same situation.
 

That's all very interesting, but again, it doesn't change anything because the only thing thousands of enthusiasts have to do if they want to see the truth is to go look at their newegg order history and recall the accompanying emotions that went along with those purchases. It would go something like this:

GTX 680 purchased: This was awesome. Badass Kepler is finally here. The perfect replacement for my GTX 580. Sweet.

GTX Titan purchased: Son of a bitch. I thought I already bought Badass Kepler for $600, but I was tricked. Thanks a lot Nvidia for replacing my 580 with a $1,000.00 flagship.

GTX 780 not purchased, but added to cart: You mean I didn't have to spend $1,000? Holy *&#$. Screw you Nvidia.

(total money spent: $1,650.00 and all they wanted to do was replace their $500.00 GTX 580 like they have been doing for years)

GTX 980: rinse, repeat
 

I absolutely love your signature... If I was a hot lady I would be PMing you right now.
 
I'd be shocked if they released the full capabilities of the Pascal architecture up front. They stopped doing that a long time ago because it made too much sense and was too good for the consumer in terms of price/perf.

They started this with the 680, when they realized their "mid range" was performing every bit as well as AMD's high end and they could charge a high-end price for it. What started this trend can just as easily end it. If AMD comes out of the gate with a product that only nVidia's high-end offerings can compete with, nVidia will have no choice but to release their high end ASAP.
 

This is correct. If it were up to me, I would have simply released the high end up front during the 600 series, charged $600.00 for it, and released the amazingly valued midrange line later, as was done traditionally. Imagine if they had done that? AMD would have been buried so damn hard that round it would have been ridiculous. It would have been the round remembered forever, where Nvidia put the beat down on them like never before and never again.
 
Since we are all theorizing on why GK104 was released as a 680, here is another angle...

Low yields on such a large die due to 28nm being a new process, along with the 18,688 GK110 GPUs that went to Oak Ridge. Note that the article was written ~6 months after the release of the 680, and about ~6 months before the release of the original GeForce Titan. Maybe 18k+ GPUs isn't that much, I have no idea. But it seems like a lot to me, meaning all the viable GPUs that could have gone toward a consumer GK110 card were used up at Oak Ridge.
 

That could be one part of it. There was further market segmentation as well: shortly after that piece was published, nVidia released the compute-only Tesla K20/K20X at price points over $3,000. No idea how many of those they sold prior to releasing the Titan/780.
 
Doubt it - they didn't do that for Maxwell after all 🙂

If we're going to see a big pascal fairly soon then you have to imagine - with this being a compute conference and all - we'll see something about it. Compute is the area they most badly need something new in of course.

Not sure why they'd announce the low/mid range gaming stuff there, even if they are due vaguely soon.
 
Do you think it's likely they will announce a release schedule at the event? Anyone's guess?

Well, GTC last year was when they showed off a lot of details about the Titan X. They didn't announce a full release schedule, though. I doubt they will announce exact release dates very far in advance.
 
No doubt they will talk about Pascal for compute at GTC; it's just a matter of what level of detail they get down to. Hopefully more than we've seen so far, and I think that's fairly likely. Working silicon? Not sure. I wouldn't be surprised if they have Big Pascal farther along than many think, if only to meet their HPC contracts (i.e. we won't see it for consumer purchase for a while even in the extremely unlikely scenario that it's already ready to go). Multi-million dollar contracts have a way of motivating management.
 