AnandTech Forums > Hardware and Technology > Video Cards and Graphics
Old 11-12-2012, 12:58 PM   #151
Lonbjerg
Banned
 
Join Date: Dec 2009
Location: Denmark
Posts: 4,426

Quote:
Originally Posted by tviceman
This again? Really? http://en.expreview.com/2010/08/09/w...-480/9070.html OH NOES! THE GTX480 drew 8 million more watts when fully unlocked! There is NO WAY IT CAN EVER BE RELEASED. EVER. Except that it was released six months later after a respin, it was named gtx580, and it came with lower power consumption and 15-20% more performance. WEIRD.

Launch GTX480 != late GTX480:


http://forums.anandtech.com/showthread.php?t=2110060

People never care about the facts.
Old 11-12-2012, 01:42 PM   #152
Silverforce11
Diamond Member
 
Join Date: Feb 2009
Location: Australia
Posts: 4,960

Why stop at 250W for consumer GK110? I see no reason for this. $600 graphics cards have their own rules and don't abide by environmental "green is good" crap or "maybe this guy doesn't have a good enough PSU" concerns. Why not push it to 300W?

You tell me a GK110 running at 250W is going to be beastly... I tell you nay. That's not much higher than GK104, and assuming GK104 is highly efficient for gaming and GK110 obviously isn't... you can figure out the rest, and why it needs to be 300W to be really good.
__________________

Rig 1: 3570K | Z77 E4 | Crossfire R290 | 840 250GB + 840EVO 250GB | 8GB G.Skill Ares 2133 | OCZ 850W Gold+ | Nanoxia DS1 | Ghetto Water
Hobby: Mobile Game Dev & Cryptocoin day-trader
Old 11-12-2012, 02:06 PM   #153
boxleitnerb
Platinum Member
 
Join Date: Nov 2011
Posts: 2,534

People have been yapping about power consumption since Fermi. Nvidia will most likely not exceed 250W. I'm good with this, because I don't think they should buy more performance with ever more power each new generation. They will roughly double the performance of the GTX580 at the same TDP, which is excellent.
Old 11-12-2012, 02:13 PM   #154
sontin
Platinum Member
 
Join Date: Sep 2011
Posts: 2,256

Quote:
Originally Posted by Silverforce11
You tell me a GK110 running at 250W is going to be beastly... I tell you nay. That's not much higher than GK104, and assuming GK104 is highly efficient for gaming and GK110 obviously isn't... you can figure out the rest, and why it needs to be 300W to be really good.
Not really.
Compare the K20X, with a maximum power consumption of 235W, to the GTX680 at 190W:
SP: +22%
TMU: +22%
BW: +30%
Pixel: +22%
ROPs: +5%
Rasterizing: -13%
Geometry: +22%
L2 Cache: +100%

I'd guess the average performance increase will be around 30%, because of the 30% bandwidth advantage over the GTX680, even with only +22% from most of the rest.

TDP increases by only 23%.
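
Those ratios are easy to re-derive from the spec sheets. A quick Python sketch (the core counts, clocks, bandwidth figures, and TDPs below are my assumptions from period spec sheets and reviews, so treat them as illustrative):

Code:
# Rough ratio check: Tesla K20X vs. GeForce GTX 680.
# All spec numbers below are assumptions from period spec sheets/reviews.
k20x   = {"cores": 2688, "clock_ghz": 0.732, "bw_gbps": 250.0, "tdp_w": 235.0}
gtx680 = {"cores": 1536, "clock_ghz": 1.006, "bw_gbps": 192.3, "tdp_w": 190.0}

def sp_gflops(gpu):
    # Peak single precision: 2 FLOPs (one FMA) per core per clock.
    return gpu["cores"] * 2 * gpu["clock_ghz"]

def pct(new, old):
    return (new / old - 1) * 100

print(f"SP:  {pct(sp_gflops(k20x), sp_gflops(gtx680)):+.0f}%")  # ~+27% at base clock
print(f"BW:  {pct(k20x['bw_gbps'], gtx680['bw_gbps']):+.0f}%")  # ~+30%
print(f"TDP: {pct(k20x['tdp_w'], gtx680['tdp_w']):+.0f}%")      # ~+24%

The +22% SP figure above lines up if you compare against the GTX680's boost clock instead of its base clock.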
Old 11-12-2012, 02:16 PM   #155
tviceman
Diamond Member
 
Join Date: Mar 2008
Posts: 5,013

Quote:
Originally Posted by Silverforce11
higher than GK104, and assuming GK104 is highly efficient for gaming and GK110 obviously isn't... you can figure out the rest, and why it needs to be 300W to be really good.


GF110 had very little difference in gaming perf/W versus GF114, despite a large difference in die size and transistor count. Perf/mm^2 is the biggest thing GK110 will noticeably suffer in when it comes to gaming. Perf/W should be similar, if not better (the GTX580 had the same or better perf/W than the GTX460), since it'll be out on a more mature node with some of the improvements and lessons learned from making the smaller Keplers. Nvidia won't push the boundaries of power draw with a GK110 desktop part, and they don't need to.
Old 11-12-2012, 02:28 PM   #156
Silverforce11
Diamond Member
 
Join Date: Feb 2009
Location: Australia
Posts: 4,960

See, it makes sense when you compare Fermi vs Fermi, since the architecture is essentially the same.

GK104 and GK110 are very, very different. IF (not even up for debate, imo) the status quo is that NV made GK104 very efficient for gaming by stripping compute... now you have GK110 with everything; it cannot be just as efficient.
__________________

Rig 1: 3570K | Z77 E4 | Crossfire R290 | 840 250GB + 840EVO 250GB | 8GB G.Skill Ares 2133 | OCZ 850W Gold+ | Nanoxia DS1 | Ghetto Water
Hobby: Mobile Game Dev & Cryptocoin day-trader
Old 11-12-2012, 02:36 PM   #157
tviceman
Diamond Member
 
Join Date: Mar 2008
Posts: 5,013

Quote:
Originally Posted by Silverforce11
See, it makes sense when you compare Fermi vs Fermi, since the architecture is essentially the same.

GK104 and GK110 are very, very different. IF (not even up for debate, imo) the status quo is that NV made GK104 very efficient for gaming by stripping compute... now you have GK110 with everything; it cannot be just as efficient.
No, it's actually pretty similar. GF114 was also stripped of compute (although not as much as GK104), but the biggest saving from taking out the compute transistors, when comparing gaming metrics, is DIE SIZE, not power consumption. That is why Nvidia was able to deliver an 80-100% performance improvement over GF114 while simultaneously decreasing die size.

As I already said, GK110 WON'T be as efficient in perf/mm^2, but when it comes to gaming and power consumption, the compute-dedicated transistors in GK110 won't be stressed - just like in GF110, which is why the 520mm^2 GF110 die was about the same in perf/W efficiency as the 330mm^2 GF114 die. In other words, as a gaming chip it will have quite a bit of wasted die space, but since it will be performing the exact same calculations in games as GK104 does now, it should be similar in perf/W.
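
To put rough numbers on the GF110-vs-GF114 point, here's a toy Python calculation (a sketch only; the die sizes, TDPs, and the ~1.45x relative gaming performance are approximate figures from reviews of the time, not exact):

Code:
# Toy perf/W vs. perf/mm^2 check: GF110 (GTX 580) vs. GF114 (GTX 560 Ti).
# All inputs are approximate, assumed period numbers.
gf114 = {"perf": 1.00, "tdp_w": 170.0, "die_mm2": 332.0}
gf110 = {"perf": 1.45, "tdp_w": 244.0, "die_mm2": 520.0}

for name, chip in (("GF114", gf114), ("GF110", gf110)):
    print(f"{name}: perf/W = {chip['perf'] / chip['tdp_w']:.4f}, "
          f"perf/mm^2 = {chip['perf'] / chip['die_mm2']:.4f}")

# perf/W comes out nearly identical (~0.0059 for both), while GF110's
# perf/mm^2 is roughly 8% worse: the big die costs area, not watts.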

Last edited by tviceman; 11-12-2012 at 02:45 PM.
Old 11-12-2012, 02:54 PM   #158
Silverforce11
Diamond Member
 
Join Date: Feb 2009
Location: Australia
Posts: 4,960

Quote:
Originally Posted by tviceman
No, it's actually pretty similar. GF114 was also stripped of compute (although not as much as GK104), but the biggest saving from taking out the compute transistors, when comparing gaming metrics, is DIE SIZE, not power consumption. That is why Nvidia was able to deliver an 80-100% performance improvement over GF114 while simultaneously decreasing die size. As I already said, GK110 WON'T be as efficient in perf/mm^2, but when it comes to gaming and power consumption, the compute-dedicated transistors in GK110 won't be stressed - just like in GF110, which is why the 520mm^2 GF110 die was about the same in perf/W efficiency as the 330mm^2 GF114 die. In other words, as a gaming chip it will have quite a bit of wasted die space, but since it will be performing the exact same calculations in games as GK104 does now, it should be similar in perf/W.
This makes a lot of sense; I can see perf/W being good at the expense of perf/mm^2.
__________________

Rig 1: 3570K | Z77 E4 | Crossfire R290 | 840 250GB + 840EVO 250GB | 8GB G.Skill Ares 2133 | OCZ 850W Gold+ | Nanoxia DS1 | Ghetto Water
Hobby: Mobile Game Dev & Cryptocoin day-trader
Old 11-12-2012, 03:21 PM   #159
boxleitnerb
Platinum Member
 
Join Date: Nov 2011
Posts: 2,534

Add to that the fact that a larger GPU running at lower frequencies is inherently more efficient.
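
A quick illustration of why (the P ~ C*V^2*f relation is the standard first-order CMOS dynamic-power model, but every concrete number below is invented for the example):

Code:
# "Wider but slower" wins on perf/W: dynamic power scales ~ C * V^2 * f,
# throughput scales ~ units * f, and lower clocks also permit lower voltage.
def dynamic_power(units, volts, freq_ghz, cap_per_unit=1.0):
    # Switched capacitance grows with the number of units.
    return units * cap_per_unit * volts**2 * freq_ghz

def throughput(units, freq_ghz, ops_per_unit_per_clk=2):
    return units * freq_ghz * ops_per_unit_per_clk

small = dict(units=8,  volts=1.10, freq_ghz=1.10)  # narrow chip, high clock/voltage
big   = dict(units=16, volts=0.95, freq_ghz=0.75)  # 2x the units, lower clock/voltage

for name, c in (("small/fast", small), ("big/slow", big)):
    p = dynamic_power(c["units"], c["volts"], c["freq_ghz"])
    t = throughput(c["units"], c["freq_ghz"])
    print(f"{name}: throughput {t:.1f}, power {p:.1f}, perf/W {t/p:.2f}")

With these made-up numbers, the wide, slow chip delivers roughly a third more throughput per unit of dynamic power at about the same total power.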
Old 11-12-2012, 04:40 PM   #160
nforce4max
Member
 
Join Date: Oct 2012
Posts: 88

Just design the card with three 8-pin 150W connectors and slap on a decent tri-slot cooler...
Old 11-12-2012, 05:26 PM   #161
ShintaiDK
Lifer
 
Join Date: Apr 2012
Location: Copenhagen
Posts: 11,606

It's interesting to see the HPC battle now.

In the 225W TDP ballpark:
Tesla K20: 706MHz, 1.17 peak DP TFLOPS, 5GB @ 208GB/sec.
Xeon Phi 5110P: 1053MHz, 1.01 peak DP TFLOPS, 8GB @ 320GB/sec.
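
Both peak-DP numbers fall straight out of units x clock arithmetic; a quick sketch (the unit counts and FLOPs-per-clock factors are my reading of the public architecture materials, so treat them as assumptions):

Code:
# Peak double-precision throughput from unit counts and clocks.
# K20 (GK110): 2496 CUDA cores, 1 DP unit per 3 SP cores, 2 FLOPs/clock (FMA).
# Xeon Phi 5110P: 60 cores, 512-bit vectors = 8 DP lanes, 2 FLOPs/clock (FMA).
k20_dp_tflops = (2496 / 3) * 2 * 0.706 / 1000
phi_dp_tflops = 60 * 8 * 2 * 1.053 / 1000

print(f"Tesla K20:      {k20_dp_tflops:.2f} peak DP TFLOPS")  # ~1.17
print(f"Xeon Phi 5110P: {phi_dp_tflops:.2f} peak DP TFLOPS")  # ~1.01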
__________________
Quote:
Originally Posted by Idontcare
Competition is good at driving the pace of innovation, but it is an inefficient mechanism (R&D expenditures summed across a given industry) for generating the innovation.
i5 4670, EVGA z87 Stinger, EVGA GTX980 SC, Ballistix Sport VLP 2x8GB, M500 480GB SSD, SG08B mITX.
Old 11-12-2012, 05:38 PM   #162
3DVagabond
Diamond Member
 
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,599

Well, back to the title: Charlie was right, the K20 is 13SMX. When questioned about ORNL's cards being 14SMX, he said that they could have a different model, which is also correct. They have the K20X.

Is this right, or have I missed something?
Old 11-12-2012, 05:43 PM   #163
sontin
Platinum Member
 
Join Date: Sep 2011
Posts: 2,256

Yeah, you missed the point where he called the K20 the "big compute version of GK110":
http://semiaccurate.com/2012/11/02/n...t-28nm-yields/

Really, the guy has no information. He knows nothing.
Old 11-12-2012, 05:53 PM   #164
notty22
Diamond Member
 
Join Date: Jan 2010
Location: Beantown
Posts: 3,332

In this quote, Charlie states 12, 13, 14, 15. How could he not get the right answer?

Of course, they could have some special chips to make them look better (than they are?).

Quote:
In fact, they could possibly pull out a sub-bin of low power chips that would make them look better. If there are enough parts for a 14SMX consumer part, a 15 SMX K20 should be an easy one to make. You can see the pattern enough to know that a 13SMX compute card means yields support a 12 or 13SMX consumer variant.
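
For what it's worth, the binning logic itself is easy to model. A toy Python sketch (the per-SMX defect probability is invented, and real yield modeling is far more involved; this only shows the shape of the argument):

Code:
# Toy SMX-binning model: GK110 has 15 physical SMXes; a k-SMX product
# needs at least k defect-free SMXes. Defects are modeled as independent
# per-SMX failures with an invented probability.
from math import comb

N_SMX = 15
P_BAD = 0.08  # hypothetical per-SMX defect probability

def frac_usable_as(k, n=N_SMX, p=P_BAD):
    # P(number of good SMXes >= k), from the binomial distribution.
    return sum(comb(n, g) * (1 - p) ** g * p ** (n - g)
               for g in range(k, n + 1))

for k in (15, 14, 13, 12):
    print(f"dice usable as a {k}-SMX part: {frac_usable_as(k):.1%}")

With these made-up numbers, only ~29% of dice have all 15 SMXes working, while ~89% can ship as a 13SMX part - which is exactly the kind of pattern Charlie is reading off the bins.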
__________________
i5 4670K@4100mhz, 32GB Kingston 1600,H50
MSI GTX 970 gaming Seasonic SS-760XP2
Munchkin 240gb SSD, Win 10 Tech preview, Samsung U28D590D


Let's make sure history never forgets... the name... 'Enterprise'. Picard out.
Old 11-12-2012, 05:56 PM   #165
3DVagabond
Diamond Member
 
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,599

Quote:
Originally Posted by sontin
Yeah, you missed the point where he called the K20 the "big compute version of GK110":
http://semiaccurate.com/2012/11/02/n...t-28nm-yields/

Really, the guy has no information. He knows nothing.
This is what I was talking about. He was right that the K20 is 13SMX. When confronted with the reported specs for Titan's cards, he stated that it might have higher-binned cards, which it does.

Quote:
Originally Posted by charlie
Did you ever consider that supercomputers may get parts that are not the same as you can buy off the shelf? Think about what that might mean in the context of poor yields. Hint: If a company suddenly needs 20K units at one bin above where things are yielding, and is contractually obliged to do it after screwing the pooch on the last generation, what might the result look like?

Discuss.

-Charlie
Disregarding his diatribe about yields (which we have no way to confirm), his saying that the K20 is 13SMX is accurate. Hate on him as much as you want when he's wrong. Hating on him and saying he knows nothing when he was actually correct just makes your statement biased.
Old 11-12-2012, 06:03 PM   #166
sontin
Platinum Member
 
Join Date: Sep 2011
Posts: 2,256

He was not right. He quoted heise.de, which got its information from a Swiss reseller of workstation stuff.
That forum post is nonsense, too. He said that after the information had already been leaked by the HPC site and Ryan Smith.

Funny that someone is supporting him after he showed that he had no information.
Old 11-12-2012, 06:06 PM   #167
iMacmatician
Member
 
Join Date: Oct 2012
Location: United States of America
Posts: 88

Is the K20X something "you can buy off the shelf" like the K20? If so, then he wasn't right.
Old 11-12-2012, 06:13 PM   #168
3DVagabond
Diamond Member
 
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,599

Quote:
Originally Posted by iMacmatician
Is the K20X something "you can buy off the shelf" like the K20? If so, then he wasn't right.
His article was about the K20. His response about the "K20X" was to explain why ORNL's specs were higher. He was just saying that the report of the K20 being 13SMX was correct and that ORNL being 14SMX would mean they had different cards. He obviously had no knowledge of the K20X, but was sticking with the K20 being 13SMX.
Old 11-12-2012, 06:19 PM   #169
ShintaiDK
Lifer
 
Join Date: Apr 2012
Location: Copenhagen
Posts: 11,606

It's easy to be "right" when you write articles about almost every possible outcome. Then you can always point to the one that actually hit and say, "I told you so!"

Reading SA is like reading a gossip magazine about celebrities.

Utter BS and crap to make stupid people keep clicking his site.
__________________
Quote:
Originally Posted by Idontcare
Competition is good at driving the pace of innovation, but it is an inefficient mechanism (R&D expenditures summed across a given industry) for generating the innovation.
i5 4670, EVGA z87 Stinger, EVGA GTX980 SC, Ballistix Sport VLP 2x8GB, M500 480GB SSD, SG08B mITX.
Old 11-12-2012, 06:32 PM   #170
SickBeast
Lifer
 
Join Date: Jul 2000
Posts: 14,299

What happened to semiaccurate.com? Their site has been down for several weeks now.
Old 11-12-2012, 06:36 PM   #171
iMacmatician
Member
 
Join Date: Oct 2012
Location: United States of America
Posts: 88

I just checked and I can load it fine.
Old 11-12-2012, 07:22 PM   #172
ViRGE
Super Moderator
Elite Member
 
Join Date: Oct 1999
Posts: 30,343

Quote:
Originally Posted by 3DVagabond
He obviously had no knowledge of the K20X, but was sticking with the K20 being 13SMX.
And that's why so few people are considering him "right" in this case. If he had any inside information, he would have known about the K20X. Instead he's relying on other sources and, because of his one-sided feud with NVIDIA, parroting the worst-case numbers.
__________________
ViRGE
Team Anandtech: Assimilating a computer near you!
GameStop - An upscale specialized pawnshop that happens to sell new games on the side
Todd the Wraith: On Fruit Bowls - I hope they prove [to be] as delicious as the farmers who grew them
Old 11-12-2012, 07:45 PM   #173
3DVagabond
Diamond Member
 
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,599

Quote:
Originally Posted by ViRGE
And that's why so few people are considering him "right" in this case. If he had any inside information, he would have known about the K20X. Instead he's relying on other sources and, because of his one-sided feud with NVIDIA, parroting the worst-case numbers.
Charlie made no claim to personal sources. He obviously had confidence in the referenced article. That aside, what I'm talking about is those who said he was wrong for reporting that the K20 was 13SMX, when in reality it is 13SMX.

The other reason "why so few people are considering him 'right'" is that many of them would say that regardless of anything. They say it purely to discredit what he writes. I'm merely pointing out that 13SMX for the K20 is correct, and the cards that ORNL has are 14SMX because they are a different card - not because Charlie was wrong, which is what many assumed.

As far as anything else he wrote, we'll have to wait and see. It could all be BS. We might never even see a GeForce GK110, never mind whether it's 12 or 13 or 14 or 15 SMX. I would be surprised, though, if we didn't see one - and if it wasn't 13SMX. As I said, time will tell.
Old 11-12-2012, 11:25 PM   #174
Ajay
Platinum Member
 
Join Date: Jan 2001
Location: NH, USA
Posts: 2,174

Quote:
Originally Posted by notty22
In this quote, Charlie states 12, 13, 14, 15. How could he not get the right answer?

Quote:
In fact, they could possibly pull out a sub-bin of low power chips that would make them look better. If there are enough parts for a 14SMX consumer part, a 15 SMX K20 should be an easy one to make. You can see the pattern enough to know that a 13SMX compute card means yields support a 12 or 13SMX consumer variant.
Of course, they could have some special chips to make them look better (than they are?).

But that's completely backwards. If NV makes a consumer GK110 AIB, they will be able to use more SMXs and/or higher clocks. Power requirements, duty cycles, reliability, etc. are less constraining in consumer cards than in those intended for professional markets.
__________________
Asus P6T V2 Deluxe Ci7 970 @ 4.2GHz w/HT, Corsair H100i, 2x240GB SanDisk Extreme RAID0, 2x WD VR 300GB RAID0, MSI GTX 680 PE @ 1110MHz, 12GB G.Skill Ripjaws DDR3 1600, Corair 850HX, Corsair 800D case. Win7 x64 Ultimate. Dell U2412M.
Heatware

Last edited by Ajay; 11-13-2012 at 12:48 PM.
Old 11-12-2012, 11:35 PM   #175
blastingcap
Diamond Member
 
Join Date: Sep 2010
Posts: 5,893

Anandtech's K20/K20X review is up: http://www.anandtech.com/show/6446/n...rrives-at-last

14 SMX for K20X, 13 SMX for K20.
__________________
Quote:
Originally Posted by BoFox
We had to suffer polygonal boobs for a decade because of selfish corporate reasons.
Main: 3570K + R9 290 + 16GB 1866 + AsRock Extreme4 Z77 + Eyefinity 5760x1080 eIPS