
[SemiAccurate] Tesla K20 specs: 13 SMX, GeForce probably 12-13 SMX

This again? Really? http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html OH NOES! The GTX 480 drew 8 million more watts when fully unlocked! There is NO WAY IT CAN EVER BE RELEASED. EVER. Except that it was released six months later after a respin, named the GTX 580, with lower power consumption and 15-20% more performance. WEIRD.


Launch GTX480 != late GTX480:


http://forums.anandtech.com/showthread.php?t=2110060

People never care for the facts 😉
 
Why stop at 250W for a consumer GK110? I see no reason for this. $600 graphics cards have their own rules and don't abide by environmental "green is good" crap or "maybe this guy doesn't have a good enough PSU" concerns. Why not push it to 300W?

You tell me a GK110 running at 250W is going to be beastly; I tell you nay. That's not much higher than GK104, and assuming GK104 is highly efficient for gaming and GK110 obviously isn't, you can figure out the rest and why it needs to be 300W to be really good.
 
People have been yapping about power consumption ever since Fermi. Nvidia will most likely not exceed 250W. I'm good with this, because I don't think they should buy more performance with ever more power each new generation. They will roughly double the performance of the GTX 580 at the same TDP, which is excellent.
 
You tell me a GK110 running at 250W is going to be beastly; I tell you nay. That's not much higher than GK104, and assuming GK104 is highly efficient for gaming and GK110 obviously isn't, you can figure out the rest and why it needs to be 300W to be really good.

Not really.
Compare the K20X, with a maximum power consumption of 235 W, to the GTX 680 at 190 W:
SP: +22%
TMU: +22%
BW: +30%
Pixel: +22%
ROPs: +5%
Rasterizing: -13%
Geometry: +22%
L2 Cache: +100%

I'd guess the average performance increase should be around 30%, thanks to the 30% bandwidth advantage over the GTX 680, even though most of the other units increase by only 22%.

TDP increases only 23%.
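For reference, the percentage figures above follow directly from the quoted specs; a quick sketch reproduces the bandwidth and TDP deltas (the raw numbers are taken from this thread and treated as assumptions, not authoritative figures):

```python
# Quick check of the spec ratios quoted above (Tesla K20X vs. GTX 680).
# Input numbers come from this thread; treat them as assumptions.

def pct_increase(new: float, old: float) -> float:
    """Percentage increase of `new` over `old`."""
    return (new / old - 1.0) * 100.0

# Memory bandwidth (GB/s): K20X ~250 vs. GTX 680 ~192
bw = pct_increase(250, 192)    # ~ +30%, matching the post

# Board power (W): K20X 235 vs. GTX 680 190
tdp = pct_increase(235, 190)   # ~ +23-24%, depending on rounding

print(f"bandwidth: +{bw:.0f}%  TDP: +{tdp:.1f}%")
```

The point of the comparison stands either way: the resource increases outpace the power increase.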
 
That's not much higher than GK104, and assuming GK104 is highly efficient for gaming and GK110 obviously isn't, you can figure out the rest and why it needs to be 300W to be really good.

[Attached image: perfwatt.jpg]


GF110 had very little difference in perf/watt in gaming despite a large difference in die size and transistor count. Perf/mm² should be the biggest thing GK110 noticeably suffers on when it comes to gaming. Perf/watt should be similar, if not better (the GTX 580 had the same or better perf/watt than the GTX 460), since it'll be out on a more mature node with some of the improvements and lessons learned from making the smaller Keplers. Nvidia won't push the boundaries of power draw with a GK110 desktop part, and they don't need to.
 
See, it makes sense when you compare Fermi vs. Fermi, since the architecture is essentially the same.

GK104 and GK110 are very, very different. If (not even up for debate, IMO) the status quo is that NV made GK104 very efficient for gaming by stripping compute, then now you have GK110 with everything, and it cannot be just as efficient.
 
See, it makes sense when you compare Fermi vs. Fermi, since the architecture is essentially the same.

GK104 and GK110 are very, very different. If (not even up for debate, IMO) the status quo is that NV made GK104 very efficient for gaming by stripping compute, then now you have GK110 with everything, and it cannot be just as efficient.

No, it's actually pretty similar. GF114 was also stripped of compute (although not as much as GK104), but the biggest saving from taking out the compute transistors, when comparing gaming metrics, is DIE SIZE, not power consumption. That is why Nvidia was able to make an 80-100% performance improvement over GF114 while simultaneously decreasing die size.

As I already said, GK110 WON'T be as efficient in perf/mm², but when it comes to gaming and power consumption, the compute-dedicated transistors in GK110 won't be stressed - just like in GF110, which is why the 520mm² GF110 die was about the same in perf/watt efficiency as the 330mm² GF114 die. In other words, as a gaming chip it will have quite a bit of wasted die space, but since it will be performing the exact same calculations in games as GK104 does now, it should be similar in perf/watt.
 
No, it's actually pretty similar. GF114 was also stripped of compute (although not as much as GK104), but the biggest saving from taking out the compute transistors, when comparing gaming metrics, is DIE SIZE, not power consumption. That is why Nvidia was able to make an 80-100% performance improvement over GF114 while simultaneously decreasing die size. As I already said, GK110 WON'T be as efficient in perf/mm², but when it comes to gaming and power consumption, the compute-dedicated transistors in GK110 won't be stressed - just like in GF110, which is why the 520mm² GF110 die was about the same in perf/watt efficiency as the 330mm² GF114 die. In other words, as a gaming chip it will have quite a bit of wasted die space, but since it will be performing the exact same calculations in games as GK104 does now, it should be similar in perf/watt.

This makes a lot of sense; I can see perf/watt being good at the expense of perf/mm².
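The perf/mm² vs. perf/watt distinction being argued above can be made concrete with a toy calculation. The die sizes (520mm² GF110, 330mm² GF114) are from the post; the performance and power figures below are made-up illustrative numbers, chosen only so that performance and power scale together:

```python
# Toy illustration of the GF110 vs. GF114 argument: a bigger die hurts
# perf/mm^2 substantially, while perf/watt can stay roughly flat if the
# extra (compute-oriented) transistors sit mostly idle in games.
# Die sizes are from the post; perf and power values are hypothetical.

def perf_per_mm2(perf: float, die_mm2: float) -> float:
    return perf / die_mm2

def perf_per_watt(perf: float, watts: float) -> float:
    return perf / watts

# Assume the big chip delivers ~1.4x the gaming performance at ~1.4x power.
gf110 = {"perf": 140.0, "die": 520.0, "power": 244.0}
gf114 = {"perf": 100.0, "die": 330.0, "power": 170.0}

ratio_mm2 = (perf_per_mm2(gf110["perf"], gf110["die"])
             / perf_per_mm2(gf114["perf"], gf114["die"]))
ratio_watt = (perf_per_watt(gf110["perf"], gf110["power"])
              / perf_per_watt(gf114["perf"], gf114["power"]))

print(f"perf/mm^2 ratio (GF110/GF114): {ratio_mm2:.2f}")  # well below 1.0
print(f"perf/watt ratio (GF110/GF114): {ratio_watt:.2f}")  # close to 1.0
```

Under these assumptions the big die is clearly worse per mm² but nearly even per watt, which is the shape of the argument being made for GK110.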
 
Well, back to the topic of the title: Charlie was right, the K20 is 13 SMX. When questioned about ORNL's cards being 14 SMX, he said they could be a different model, which is also correct - they have the K20X.

Is this right, or have I missed something?
 
In this quote, Charlie states 12, 13, 14, and 15. How could he not get the right answer?

Of course they could have some special chips to make them look better (than they are?).

In fact, they could possibly pull out a sub-bin of low power chips that would make them look better. If there are enough parts for a 14SMX consumer part, a 15 SMX K20 should be an easy one to make. You can see the pattern enough to know that a 13SMX compute card means yields support a 12 or 13SMX consumer variant.
 
Yeah, you missed the point where he called the K20 the "big compute version of GK110":
http://semiaccurate.com/2012/11/02/nvidia-tesla-k20-specs-gives-hints-about-28nm-yields/

Really, the guy has no information. He knows nothing.

This is what I was talking about. He was right that the K20 is 13SMX. When confronted with the reported specs for Titan's cards, he stated that it might have higher binned cards, which it does.

charlie said:
Did you ever consider that supercomputers may get parts that are not the same as you can buy off the shelf? Think about what that might mean in the context of poor yields. Hint: If a company suddenly needs 20K units at one bin above where things are yielding, and is contractually obliged to do it after screwing the pooch on the last generation, what might the result look like?

Discuss.

-Charlie

Disregarding his diatribe about yields (which we have no way to confirm), his saying that the K20 is 13 SMX is accurate. Hate on him as much as you want when he's wrong. Hating on him and saying he knows nothing when he was actually correct just makes your statement biased.
 
He was not right. He quoted heise.de, which got its information from a Swiss reseller of workstation gear. That forum post is nonsense, too - he said that after the information was leaked by the HPC site and Ryan Smith.

Funny that someone is supporting him after he showed he had no information.
 
Is the K20X something "you can buy off the shelf" like the K20? If so then he wasn't right.

His article was about the K20. His response about the "K20X" was to explain why ORNL's specs were higher. He was just saying that the report of the K20 being 13SMX was correct and that ORNL being 14SMX would mean they had different cards. He obviously had no knowledge of the K20X, but was sticking with the K20 being 13SMX.
 
It's easy to be "right" when you write articles about almost every possible outcome. Then you can always point to the one that actually hit and say, "I told you so!"

Reading SA is like reading a gossip magazine about celebrities.

Utter BS and crap to make stupid people keep clicking his site.
 
He obviously had no knowledge of the K20X, but was sticking with the K20 being 13SMX.
And that's why so few people are considering him "right" in this case. If he had any inside information he would have known about K20X. Instead he's relying on other sources, and because of his one-sided feud with NVIDIA, parrots the worst case numbers.
 
And that's why so few people are considering him "right" in this case. If he had any inside information he would have known about K20X. Instead he's relying on other sources, and because of his one-sided feud with NVIDIA, parrots the worst case numbers.

Charlie made no claim to personal sources. He obviously had confidence in the reference article. That aside, what I'm talking about is those who said he was wrong reporting that K20 was 13SMX, when in reality it is 13SMX.

The other reason "why so few people are considering him "right"" is that many of them would say that regardless of anything. They just say it purely to discredit what he writes. I'm merely pointing out that 13SMX for the K20 is correct and the cards that ORNL has are 14SMX because they are a different card. Not because Charlie was wrong. Which is what many assumed.

As far as anything else he wrote, we'll have to wait and see. It could all be BS. We might never even see a GeForce GK110, never mind whether it's 12, 13, 14, or 15 SMX. I would be surprised, though, if we didn't see one and it wasn't 13 SMX. As I said, time will tell.
 
In this quote, Charlie states 12, 13, 14, 15. How could he not get the right answer?

Quote:
In fact, they could possibly pull out a sub-bin of low power chips that would make them look better. If there are enough parts for a 14SMX consumer part, a 15 SMX K20 should be an easy one to make. You can see the pattern enough to know that a 13SMX compute card means yields support a 12 or 13SMX consumer variant.

Of course they could have some special chips to make them look better (than they are?).

But that's completely backwards. If NV makes a consumer GK110 AIB, they will be able to use more SMXs and/or higher clocks. Power requirements, duty cycles, reliability, etc. are less constraining in consumer cards than in those intended for professional markets.

Edit:What the heck is going on with all the stars in the text???
 