Charlie D Claim Watch


M0RPH

Diamond Member
Dec 7, 2003
3,305
1
0
Seems to me he knows a lot more than most of you guys who like to say he's full of shit at every opportunity.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
This thread's existence makes me laugh. Arguing about the accuracy of a rumor after the fact? o_O Makes me think of this.

If the rumours are generally pretty accurate (it seems like, for the most part, they actually are), then as a source he is not terrible.

It's not so much discussing the accuracy of rumours as ascertaining the value of S|A as a source of information for future reference, which is done by working out whether his past claim history has a reasonable degree of accuracy.

The other option is to continue as usual and have people argue that he's always full of shit while others say he's reporting facts.
Or do you not like testing your sources for accuracy?
 
Feb 19, 2009
10,457
10
76
He's got some good contacts with NV employees aka "moles", this I'm certain of.

The reason the TDP numbers changed is that originally a 512SP part on A2 at 500MHz would pull around 280W. Now on A3, at 480SP and 700MHz, NV claims it's 250W, but clearly it isn't, as it's got a higher power draw than a 5970, which is 295W. Clock speed was always a problem, so to up it they had to really increase the amps, which led to a massive TDP.
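
The clock-for-power trade in that paragraph can be sketched with the usual first-order CMOS dynamic-power relation, P ~ N_units * f * V^2. The baseline and target figures below are the rumoured numbers from this thread, not official specs, and leakage is ignored:

```python
# First-order dynamic power scaling sketch: P ~ N_units * f * V^2.
# Figures are the rumoured numbers from this thread, not official specs.

def scaled_power(p_base, units_base, f_base, units, f, v_ratio=1.0):
    """Scale a baseline dynamic power with unit count, clock, and voltage ratio."""
    return p_base * (units / units_base) * (f / f_base) * v_ratio ** 2

# Rumoured A2 baseline: 512 SPs at 500 MHz pulling ~280 W.
# A3 part: 480 SPs at 700 MHz, voltage assumed unchanged.
p_a3 = scaled_power(280.0, 512, 500.0, 480, 700.0)
print(f"estimated A3 draw: ~{p_a3:.0f} W")  # ~368 W, well above the claimed 250 W
```

Even at constant voltage the naive scaling lands well past 250W, and in practice a 40% clock bump usually needs a voltage increase too, so the V^2 term only makes it worse.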

The moral of this episode: do not trust NV spin, ever.

Is it big, power hungry, hot and unmanufacturable? Given the definition in terms of silicon production, a yield of less than 20% is utter failure. So yeah, he was spot on. Even his performance prediction of 10-15% faster in games. Funnily enough, in some games there is no difference, or the 480 performs worse. I mean, why is it the same or worse in new games? Dirt 2 in DX11 mode, BF:BC2, Stalker:CoP, ArmA 2, Anno 1404, Need for Speed: Shift... these are pretty big titles. Even in Crysis it's about even.
 

thedosbox

Senior member
Oct 16, 2009
961
0
0
If the rumours are generally pretty accurate (it seems like, for the most part, they actually are), then as a source he is not terrible.

It's not so much discussing the accuracy of rumours as ascertaining the value of S|A as a source of information for future reference, which is done by working out whether his past claim history has a reasonable degree of accuracy.

The other option is to continue as usual and have people argue that he's always full of shit while others say he's reporting facts.
Or do you not like testing your sources for accuracy?

LOL - I like to stick with facts, not rumors, so these types of sites are just fluff. Some of you take this nonsense waaaay too seriously.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
So, what I'm getting from both sides, is that:

1. Anything can mean anything.
2. Anything is up for interpretation.
3. Nothing is indisputable and therefore a resolution can never be achieved.
4. We should all be gaming. :D
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
So, what I'm getting from both sides, is that:

1. Anything can mean anything.
2. Anything is up for interpretation.
3. Nothing is indisputable and therefore a resolution can never be achieved.
4. We should all be gaming. :D

If you read or have read Charlie's articles, you can see very clearly what he means.

There is also a difference, and a reason Charlie runs a rumour site and not a "respectable site": he takes more chances, meaning he will be wrong sometimes, or even most of the time, but on the flip side he can also come out with stories before most others.

Actually, I bet that if I removed all the NVIDIA-hate sentences from Charlie's articles and toned down the titles, your opinion would be different.

For example, "Nvidia's Fermi GTX480 is broken and unfixable" -> "Nvidia is having problems getting out a GTX480 with 512 shaders and its target TDP (subtitle: low yields and hot)".

It means the same, but with less emotion: less love and less hate.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
If you read or have read Charlie's articles, you can see very clearly what he means.

There is also a difference, and a reason Charlie runs a rumour site and not a "respectable site": he takes more chances, meaning he will be wrong sometimes, or even most of the time, but on the flip side he can also come out with stories before most others.

Actually, I bet that if I removed all the NVIDIA-hate sentences from Charlie's articles and toned down the titles, your opinion would be different.

For example, "Nvidia's Fermi GTX480 is broken and unfixable" -> "Nvidia is having problems getting out a GTX480 with 512 shaders and its target TDP (subtitle: low yields and hot)".

It means the same, but with less emotion: less love and less hate.

So, like I said: what might be as clear as glass to you won't be for another. Open to all interpretation, nothing is indisputable, and nothing gets achieved.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
So, like I said: what might be as clear as glass to you won't be for another. Open to all interpretation, nothing is indisputable, and nothing gets achieved.
That only depends on how much your green-tinted sunglasses and blinders block out. I knew months ago that Fermi was going to be unimpressive, largely from rumor sites like SemiAccurate. I then waited for a good deal on another high-end setup and settled on 5850 CF, which proved to be a major home run after Fermi released.
 

ginfest

Golden Member
Feb 22, 2000
1,927
3
81
That only depends on how much your green-tinted sunglasses and blinders block out. I knew months ago that Fermi was going to be unimpressive, largely from rumor sites like SemiAccurate. I then waited for a good deal on another high-end setup and settled on 5850 CF, which proved to be a major home run after Fermi released.


Wow, so you "knew" for a certainty that Fermi would turn out the way it has? Just by putting on your "red-tinted" glasses and reading an anti-NV-slanted website? ;)
Amazing precognition there :)
Of course, some say that hindsight is 20/20 vision!
 

yh125d

Diamond Member
Dec 23, 2006
6,907
0
76
Wow, so you "knew" for a certainty that Fermi would turn out the way it has? Just by putting on your "red-tinted" glasses and reading an anti-NV-slanted website? ;)
Amazing precognition there :)
Of course, some say that hindsight is 20/20 vision!

"Hindsight is 20/20" doesn't really apply here. Just about everybody expected Fermi to be underwhelming performance-wise, power hungry, hot, and possibly expensive, and had been for 3 months beforehand. The only applicable hindsight remark would be "I guess in hindsight all those people expecting Fermi to be fairly poor were right (not only fanboys said that), and the nvidia guys who somehow still expected it to be amazing were wrong".
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Wow, so you "knew" for a certainty that Fermi would turn out the way it has? Just by putting on your "red-tinted" glasses and reading an anti-NV-slanted website? ;)
Amazing precognition there :)
Of course, some say that hindsight is 20/20 vision!
The rumors were there for months; where were you? On the surface, everyone should have immediately known something was wrong when it was delayed for six months. Kyle at HardOCP said last October that we should expect a March release. A product isn't delayed that long "to increase the awesome factor"; it's delayed that long because something is seriously wrong with it. So the mission then was to find out what was so wrong with it.

You then add in the numerous sites reporting on Fermi's development (SemiAccurate was actually pretty good here), and it was all pretty clear. They were having yield problems - i.e. the chips they were getting weren't doing what they wanted (put simply). So why were they having yield problems? Well, then the numbers came out regarding missed clock speeds and very high TDPs. Couple that with the "fake" Fermis shown, as well as all the red tape thrown around, and one could assume what the problem was: the new architecture was screwed, and they were making serious compromises in performance to get a practical part.
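
The yield side of that argument can be made concrete with the textbook Poisson die-yield model, Y = exp(-D0 * A). The defect density below is an assumed, purely illustrative value; the die areas are the rough figures quoted elsewhere in this thread (~530 mm^2 for GF100, ~334 mm^2 for Cypress):

```python
import math

# Poisson die-yield model sketch: Y = exp(-D0 * A).
# D0: defect density in defects/cm^2 (assumed, illustrative only).
# A: die area, converted from mm^2 to cm^2.

def poisson_yield(die_area_mm2: float, d0_per_cm2: float) -> float:
    return math.exp(-d0_per_cm2 * die_area_mm2 / 100.0)

d0 = 0.4  # assumed defects/cm^2 for a struggling 40nm process
gf100_yield = poisson_yield(530.0, d0)
cypress_yield = poisson_yield(334.0, d0)
print(f"GF100 ~{gf100_yield:.0%}, Cypress ~{cypress_yield:.0%}")
```

The exact percentages depend entirely on the assumed D0, but the point survives any choice of it: yield falls off exponentially with area, so the same process problems hit a ~60% larger die far harder than proportionally.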

So I then assumed Fermi would be hot (and therefore loud), and that performance would not be that impressive (and way below the "Fermi will destroy the 5870" nonsense said at the 58xx series release; it just goes to show they weren't even making actual parts at the time). Then came rumors of $500+ prices, as well as all the last-minute work (constant clock changes, updates, BIOS changes, etc.), and it was pretty obvious they were just barely getting by on this one. Turns out I was right, but more so this is a victory for the "rumor sites", as they were pretty spot on this time around.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Turns out I was right, but more so this is a victory for the "rumor sites" as they were pretty spot on this time around.

Looking back over everything Charlie said, did he get a single thing right? If he did, I must have missed it.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
What did he get wrong?

448 shaders for the top-bin part
600MHz for the top-bin part
Top-bin part will have 5K units total and no more will ever be produced
Performance level of the 480 was wrong (the margin was 2x-3x higher than his lies)
Initial tape-out was wrong (which actually makes nV look worse, but yet another thing he was just telling lies about)

Pretty much he just spewed a bunch of crap out, going everywhere from 512SPs at 750MHz to 448SPs and 500MHz clocks, and figured one of his fabrications would end up being right.

Read his latest article. Turns out that the 5870 is slow and that its yields were so low it was not worth launching (AMD had stated that the 58xx parts were getting 40% yields at launch; 62.5% is barely worth launching in Charlie's reality). Take him as accurate if you will; AMD was stupid to launch its slow parts according to Charlie too. To say he is a fool is putting it mildly, but if you think of him as insightful, keep in mind that he is stating on the record that AMD is stupid and slow now :)
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
448 shaders for the top-bin part
600MHz for the top-bin part
Top-bin part will have 5K units total and no more will ever be produced
Performance level of the 480 was wrong (the margin was 2x-3x higher than his lies)
Initial tape-out was wrong (which actually makes nV look worse, but yet another thing he was just telling lies about)

Pretty much he just spewed a bunch of crap out, going everywhere from 512SPs at 750MHz to 448SPs and 500MHz clocks, and figured one of his fabrications would end up being right.
Go reread what I wrote, because you didn't understand a thing. He could have stated that NVIDIA's new part was called "the X1Z FTW WAHHOOOOO" and you would post "see, he's wrong, it's called the GTX480, see there, that's another thing he got wrong", and I would still be laughing in your face like I am now.

He and other rumor sites got enough information out that was relatively accurate. For instance, all those specifications, which were officially known to no one, are actually pretty damn close to what came out, and are in line with NVIDIA cutting down the GPU with each revision to get a feasible part. How do you know that 448 shaders wasn't what NVIDIA was going for at one point, before an engineering breakthrough later allowed 480 shaders at a decent clock speed? You don't. I think it's funny that you, like some others on this forum, hold Charlie in such high regard that you treat his word as fact, and are then personally insulted when what he reported has changed (after all, they're just rumors). Kind of ironic, huh?

Read his latest article. Turns out that the 5870 is slow and that its yields were so low it was not worth launching (AMD had stated that the 58xx parts were getting 40% yields at launch; 62.5% is barely worth launching in Charlie's reality). Take him as accurate if you will; AMD was stupid to launch its slow parts according to Charlie too. To say he is a fool is putting it mildly, but if you think of him as insightful, keep in mind that he is stating on the record that AMD is stupid and slow now :)
Where's your link?
 

shangshang

Senior member
May 17, 2008
830
0
0
I'll bet Charlie has gained a lot of cred points already. I'll bet next time Charlie says something about NV, people will pay a lot more attention rather than just calling him BS.

for this round:

Charlie - 1
Nvidia - 0
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Wow, so you "knew" for a certainty that Fermi would turn out the way it has? Just by putting on your "red-tinted" glasses and reading an anti-NV-slanted website? ;)
Amazing precognition there :)
Of course, some say that hindsight is 20/20 vision!

I believe MrK6. Anyone could have seen it coming if you followed the GPU game closely enough over the last few years. They were increasing the chip size every generation; ATI broke down, went for a smaller design and a dual-GPU card as their top end, and called it quits chasing the single-GPU halo. Turned out to be a smart decision in the long run. People were calling G80 & GT200 big and hungry. There was no sign of change for GT300/GF100/Fermi in this respect: they were doubling everything again and adding compute-centric designs. They hoped 40nm would bring power & thermals down to suit the design they began in 2006.

Turned out to be an incorrect assumption. The size of the chip and the power it requires to run stably are more in favor of Cypress than GF100 on the current node.

You are correct that most people with a bad outlook for GF100 months back were wearing red-tinted glasses, Charlie being the number 1 dude in that respect. I say he knew with certainty in his own opinion... sure. Hahah.
 

yh125d

Diamond Member
Dec 23, 2006
6,907
0
76
448 shaders for the top-bin part
600MHz for the top-bin part
Top-bin part will have 5K units total and no more will ever be produced
Performance level of the 480 was wrong (the margin was 2x-3x higher than his lies)
Initial tape-out was wrong (which actually makes nV look worse, but yet another thing he was just telling lies about)

Pretty much he just spewed a bunch of crap out, going everywhere from 512SPs at 750MHz to 448SPs and 500MHz clocks, and figured one of his fabrications would end up being right.

Read his latest article. Turns out that the 5870 is slow and that its yields were so low it was not worth launching (AMD had stated that the 58xx parts were getting 40% yields at launch; 62.5% is barely worth launching in Charlie's reality). Take him as accurate if you will; AMD was stupid to launch its slow parts according to Charlie too. To say he is a fool is putting it mildly, but if you think of him as insightful, keep in mind that he is stating on the record that AMD is stupid and slow now :)

448/600MHz, who knows. People other than Charlie have said that nvidia had been changing the final specs almost right up until launch. He could have been completely right when he reported that; we'll never know.

Yeah, he was way off on 5K units.

He said the 480 was 5% faster when it's really 15%, but once again, that may have been completely accurate if they were doing a 512/600MHz part. They hadn't set it in stone yet.

He was off on tape-out, but why do you care? I'd figure you'd be happy that he expected NV to do that well.


As for what has he gotten right, are you retarded? Read the OP. Oh right, you're just trolling. NVM
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I believe MrK6. Anyone could have seen it coming if you followed the GPU game closely enough over the last few years. They were increasing the chip size every generation; ATI broke down, went for a smaller design and a dual-GPU card as their top end, and called it quits chasing the single-GPU halo. Turned out to be a smart decision in the long run. People were calling G80 & GT200 big and hungry.

Actually the physical chip size is about the same as the last new nvidia architecture, the G80. So using your argument leads to exactly the opposite result: the G80 was probably the most successful GPU of all time, hence Fermi should be too.

As for Charlie - he was wrong in pretty well all of his "dramatic" assertions, in that he took some information which had a grain of truth, then made a load of rubbish up off the back of it. E.g. the EOL of the GTX 275/285 meant nvidia was leaving the high-end market. The TSMC problems meant Fermi was "unmanufacturable". The 280W power usage, which he guesstimated from the 448-shader part (not 512 as people keep claiming - read his article) and a 600MHz core clock. The extra functionality for GPU compute, which he said meant nvidia was abandoning gamers - remember how Fermi wasn't meant to have any hardware tessellation. The only one that really panned out was how late Fermi was, which, going by how far off everything else was, seems more luck than judgement.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,414
401
126
LOL - I like to stick with facts, not rumors, so these types of sites are just fluff. Some of you take this nonsense waaaay too seriously.
When a manufacturer/designer doesn't volunteer the information and everyone's under an NDA (i.e. can neither confirm nor deny), then everything is a rumour by definition, genius ;)
I feel a little dirty partaking in the vid card forums, but just so no one labels me a fanboi, I'll say that I am a Christmas elf - red AND green :p
(X800GTO, X1900XT, 8800GTS, 8800GTX, 4850, 4890)
 

yh125d

Diamond Member
Dec 23, 2006
6,907
0
76
Actually the physical chip size is about the same as the last new nvidia architecture, the G80. So using your argument leads to exactly the opposite result: the G80 was probably the most successful GPU of all time, hence Fermi should be too.

As for Charlie - he was wrong in pretty well all of his "dramatic" assertions, in that he took some information which had a grain of truth, then made a load of rubbish up off the back of it. E.g. the EOL of the GTX 275/285 meant nvidia was leaving the high-end market. The TSMC problems meant Fermi was "unmanufacturable". The 280W power usage, which he guesstimated from the 448-shader part (not 512 as people keep claiming - read his article) and a 600MHz core clock. The extra functionality for GPU compute, which he said meant nvidia was abandoning gamers - remember how Fermi wasn't meant to have any hardware tessellation. The only one that really panned out was how late Fermi was, which, going by how far off everything else was, seems more luck than judgement.

G80 was so successful because it:

Was on time
Had no competition for years
Had been shrunk into G92, with much more manageable power draw, by the time competition finally arrived

Fermi:

Is late
Has had strong competition for 5 months already
Has no competition-free window for respins/die shrinks to get power in check

Fermi could definitely lead to a very successful GPU, but not Fermi itself. In the same vein as G80 being small potatoes compared to G92, NV can take the good architecture behind Fermi and mold it into something great in the future that will do much better than the initial Fermi.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
The 280W power usage, which he guesstimated from the 448-shader part (not 512 as people keep claiming - read his article) and a 600MHz core clock.

Link please.

Nvidia castrates Fermi to 448SPs 21 December 2009

The rest however is Nvidia's fault. It designed a chip that was more or less unmanufacturable, and we have been saying as much for more than six months now. It is big, 530mm^2 at the minimum, likely 10+mm^2 more, and way too hot. The 448 SP version is listed at 225W TDP, 190W 'typical', and that is with two of the 16 shader clusters fused off. With them on, it would likely be well above 250W, far too hot for a single GPU card.

Nvidia GF100 pulls 280W and is unmanufacturable 17 January 2010


The raw manufacturing cost of each GF100 to Nvidia is more than double that of ATI's Cypress. If the target product with 512 shaders is real, the recently reported 40 percent yield rates don't seem to be obtainable. It won't hit half of that based on Nvidia's current 40nm product yields, likely far far less.

Cost aside, the next problem is power. The demo cards at CES were pulling 280W for a single GPU which is perilously close to the 300W max for PCIe cards. Nvidia can choose to break that cap, but it would not be able to call the cards PCIe. OEMs really frown on such things. Knowingly selling out of spec parts puts a huge liability burden on their shoulders, and OEMs avoid that at all costs.

280W and 550mm^2 means Nvidia is maxed out on both power use and reticule area for any product from TSMC. There is precious little room to grow on either constraint. The competition on the other hand can grow its part by 60 percent in die area and over 50 percent in power draw while staying below what Nvidia is offering. That puts an upper bound on Nvidia's pricing in a fairly immutable way, barring a massive performance win. If you don't feel like reading to the end, the short story is that it didn't get that win.


Nvidia's Fermi GTX480 is broken and unfixable 17 February 2010

Fixing these problems requires Nvidia to do what ATI did for Evergreen, that is, double up on the vias and also change the circuits in a non-trivial way. This process requires a lot of engineering time, a base layer respin, and probably at least one metal spin on top of that. If everything goes perfectly, it will still be more than six months before it can bring a fix to market.

While this is bad for Nvidia, and likely terminal for Fermi GF100 as an economically viable chip, it does actually get worse. The chip is big and hot. Insiders have told SemiAccurate that the chips shown at CES consumed 280W. Nvidia knew that the GPU would consume a lot of power long before the chip ever taped out, but it probably thought it would be around the 225W mark claimed for the compute boards.

To combat this, Nvidia engineers tell SemiAccurate that the decision was made to run the chip at a very low voltage, 1.05V versus 1.15V for ATI's Cypress. Since ATI draws less power for Cypress, 188W TDP vs 225W TDP for the Fermi GF100, every time Nvidia needs to tweak the voltage of its card, that results in roughly 50 percent more amperage used for every .01V the core is raised by. While this is oversimplification, the take-home message is that Nvidia made choices that result in more power added than ATI if the voltages need to be upped.

SemiAccurate wrong about Nvidia 480GTX power use 12 March 2010

If you recall, the official story is that the card, in its cut-down and underclocked version, pulls 225W. That number, along with stunningly poor performance, has led to some notable backpedaling. If that isn't bad enough, some sources at GDC told SemiAccurate that Nvidia jacked up the TDP by 50W last week without warning.

We will be the first to admit we were wrong about the TDPs of the cards. At CES we said the GTX480s shown there were pulling 280W, something Nvidia vehemently denied. Engineers beavering away at the things Dear Leader thinks are important, like the style of the wheels on his Ferrari, have been pulled off to work on cards for some unfathomable reason. Working hard, they have managed to reduce the TDP of the cards 5W to 275W. Yeah, Nvidia finally admitted that the card is the burning pig anyone who has used one knows it is.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Go reread what I wrote again because you didn't understand a thing.

This is quite clearly black and white: correct or not. Charlie was wrong on pretty much everything (he was wrong on absolutely everything at least most of the time).

448/600mhz who knows. People other than charlie have said that nvidia had been changing the final specs right up until launch almost.

And yet Charlie knew better than nVidia - that is precisely my point. He was making things up; he had no real information - it was all fabricated. As you have clearly pointed out yourself, not even nVidia knew, so how the hell did Charlie have a mole who was supposed to have the information?

He was off on tape out, but why do you care? I'd figure you'd be happy that he expected NV to do that well.

I don't want or not want any company to perform in a particular fashion - Charlie is currently bashing AMD's performance too, with the same level of ignorance, just without his biased drivel as a side dish. This has nothing to do with wanting anyone to perform in a particular fashion - it is about the fact that Charlie is 100% bullshit all the time.

No matter who stated what about Fermi, you will find that I didn't believe any of it. Charlie made it shockingly easy to see how wrong he was, because he contradicted himself more than anyone else reasonably could.

As for what has he gotten right, are you retarded?

Which parts did he get right? That Fermi was going to be big? I'll tell you, no one could ever have figured that out when we knew it was going to be 3 billion transistors...

Where's your link?

had a yield of 62.5 percent, give or take a little, and that yield was considered so low that it was almost not worth launching.

GTX480 is slow, barely faster than an ATI HD5870.

http://www.semiaccurate.com/2010/03/28/why-nvidia-hacked-gtx480/

Taiwan Semiconductor Manufacturing Company (TSMC), the world's largest dedicated independent semiconductor foundry, recently confirmed it has run into new issues with their 40nm process technology that have sent yield rates down to 40%.

http://www.techspot.com/news/36781-tsmc-40nm-yield-issues-to-affect-amd-and-nvidia.html

According to Charlie the 5870 is slow and was unlaunchable- given your praise of him I can only assume that you agree.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
http://www.techspot.com/news/36781-tsmc-40nm-yield-issues-to-affect-amd-and-nvidia.html

According to Charlie the 5870 is slow and was unlaunchable- given your praise of him I can only assume that you agree.

Of course, the 5870 is much smaller and so cheaper to produce, and it consumes a lot less power; as far as I remember, the TSMC yield issues with Cypress involved some machinery calibration problems and have since been solved. Cypress has been out for 6 months now.

So it isn't as black and white as you paint it.

Regardless of him being a jackass regarding NVIDIA (and it is mutual), the fact is that either he was extremely lucky or he did in fact have some reasonably reliable sources.

The fact is that if he had gotten this Fermi business wrong, people would rightly ignore him next time regarding NVIDIA.

But he seems to be right - either by chance, in your opinion, or simply due to solid sources of information - so some people won't disregard future news about NVIDIA.