Fuad admits Charlie was right about Fermi...

See my quote. You're accusing someone of something that has so far rather been your signature feature here...

I don't have sigs enabled. Explain an A1 stepping. You have three choices: tell the truth and admit Charlie is a liar/idiot, agree with Charlie and make yourself look like one, or dodge the question and let the action speak for itself.

Surprisingly enough you somehow always end up defending Nvidia

Really? Dig up the last time I told anyone to buy a GTX295 or GTX285 for that matter. I can link the 5850 recommendations no problem. Best of luck to you btw 🙂
 
For the rest of you, check out his conversation here and decide if he knows what he's talking about: http://www.semiaccurate.com/forums/showpost.php?p=14429&postcount=47

My god, Charlie can actually make for a good read when he abstains from all the overly silly rhetoric that embellishes his NV-related articles on the frontpage.

That post he made there is sane, rational thinking framed with reasonable caveats in his if/then logic tree.

If only he'd port more of that to his frontpage and leave out the vitriol...
 
My god, Charlie can actually make for a good read when he abstains from all the overly silly rhetoric that embellishes his NV-related articles on the frontpage.

That post he made there is sane, rational thinking framed with reasonable caveats in his if/then logic tree.

If only he'd port more of that to his frontpage and leave out the vitriol...

That's my point. I don't like his idiotic frontpage pieces either (though I'm sure good part of it is a show, nothing else 😉) but hey, I can get pretty pissed on many things several times a week so I give him the chance and read his stuff - and I always get new info and most of the time he's dead on and about Fermi he was spot on so far.

BTW check out the entire thread about Rys' TR article , it's worth the read: http://www.semiaccurate.com/forums/showthread.php?p=14429#post14429
 
I don't have sigs enabled. Explain an A1 stepping. You have three choices: tell the truth and admit Charlie is a liar/idiot, agree with Charlie and make yourself look like one, or dodge the question and let the action speak for itself.

For the last time, and then I'm done with this: I said he knows what he's talking about. You can try to counter it. Or not. But this kind of rather lame attempt to pass the ball doesn't earn you a cookie...

Really? Dig up the last time I told anyone to buy a GTX295 or GTX285 for that matter. I can link the 5850 recommendations no problem. Best of luck to you btw 🙂

I didn't say you are stupid 😀 - I said you are defending them even when it's a ridiculous thing to do.
 
My god, Charlie can actually make for a good read when he abstains from all the overly silly rhetoric that embellishes his NV-related articles on the frontpage.

That post he made there is sane, rational thinking framed with reasonable caveats in his if/then logic tree.

If only he'd port more of that to his frontpage and leave out the vitriol...

This is exactly how I feel. He's actually reasonable on the forum, but his article style is abrasive. I don't actually read his articles any more; I read his forum post history instead. It tends to be the same content in more depth, sooner, and, as you said, much more reasoned. And he corrects himself or explains previous mistakes (e.g. "no dual GT200" applied to a different project that was indeed cancelled, referring to two GTX280s on one chip. The GTX295 is basically dual 275s.)
 
Quote me where did I spread any BS.

If you cannot then you have the right to remain silent about me, pal.


So... what's your point actually?

Ooh, internet tough guy here


My point is that you and others put way too much thought and energy into reading what's essentially misinformation for entertainment.
 
you could be just another non-native English-speaker like me...

My first language was Mandarin Chinese, but I dream in English now. :awe:

And he corrects himself or explains previous mistakes (e.g. "no dual GT200" applied to a different project that was indeed cancelled, referring to two GTX280s on one chip. The GTX295 is basically dual 275s.)

Sure, anyone can spout prophecy and then later go "oh, that's what I meant." Stating the obvious is another gem.

I believe that at the time all he said was "G200" without reference to manufacturing process or version or anything like that.

But... whatever. I think we can all pretty much agree on a few things:
Charlie is sometimes right.
Charlie is sometimes not right.
Charlie hates all things Nvidia.
Any thread about Charlie/Nvidia eventually self destructs.

:\
 
My first language was Mandarin Chinese, but I dream in English now. :awe:


Sure, anyone can spout prophecy and then later go "oh, that's what I meant." Stating the obvious is another gem.

I believe that at the time all he said was "G200" without reference to manufacturing process or version or anything like that.

But... whatever. I think we can all pretty much agree on a few things:
Charlie is sometimes right.
Charlie is sometimes not right.
Charlie hates all things Nvidia.
Any thread about Charlie/Nvidia eventually self destructs.

:\

That I can agree with. Lock it up!
 
+1. For a product that is delayed as much as it already has been, almost all will be forgiven as long as it comes out really great.

I'm starting to think Fermi is only going to be so-so; competitive, but not 8800GT amazing. I do think, however, that nvidia's evolution and future derivatives of Fermi will be blatantly superior products.
 
I'm starting to think Fermi is only going to be so-so; competitive, but not 8800GT amazing. I do think, however, that nvidia's evolution and future derivatives of Fermi will be blatantly superior products.

R600 all over again... The same story repeats itself; I wonder if the two companies are in collusion. NV30 was a failure and weak, NV40 fixed it and it shone; R600 was a failure and weak, RV770 fixed it and shone. R420 was an extension of an already strong R350 GPU; GT200 is an extension of an already strong G92 GPU. nVidia downplayed DX8.1, ATi downplayed SM3.0, nVidia downplayed DX10.1. nVidia had issues with IBM's 130nm node, ATi had issues with TSMC's 90nm node, and they've traded blows across generations on AA/AF quality. I wonder...
 
I'm starting to think Fermi is only going to be so-so; competitive, but not 8800GT amazing. I do think, however, that nvidia's evolution and future derivatives of Fermi will be blatantly superior products.

This is actually what I want; a so-so Fermi. If Fermi has a decent performance advantage, I'm not sure we will see an aggressive price war. Nvidia will be more than happy to price Fermi above the 5870 and ATI won't have much reason to lower prices. This assumes both companies can move enough product at the higher price points. If Fermi underperforms compared to the 5XXX line, then we should have an exact repeat of the GT200/48XX pricing, only with the players reversed.
 
I'm starting to think Fermi is only going to be so-so; competitive, but not 8800GT amazing. I do think, however, that nvidia's evolution and future derivatives of Fermi will be blatantly superior products.

I too am wondering if Fermi is going to be a monster in the GPGPU arena, but be so-so for gaming. It just seems to me that's the direction Nvidia is heading. Of course we'll have to wait and see... maybe we'll all be pleasantly surprised, maybe we'll see a Radeon 2900XT type part all over again.
 
I too am wondering if Fermi is going to be a monster in the GPGPU arena, but be so-so for gaming. It just seems to me that's the direction Nvidia is heading. Of course we'll have to wait and see... maybe we'll all be pleasantly surprised, maybe we'll see a Radeon 2900XT type part all over again.

Well they've got 50% more xtors invested in the sucker versus Cypress...gonna be sad if it is just so-so for gaming.
 
Well they've got 50% more xtors invested in the sucker versus Cypress...gonna be sad if it is just so-so for gaming.
But that's the problem, right? They married a gaming card with a compute card, so to speak, and ended up with a card that so far promises to be a monster in the GPGPU scene, but not necessarily a kick-ass GPU. Or at least, it managed to make us a bit doubtful or slightly pessimistic about it.
 
But that's the problem, right? They married a gaming card with a compute card, so to speak, and ended up with a card that so far promises to be a monster in the GPGPU scene, but not necessarily a kick-ass GPU. Or at least, it managed to make us a bit doubtful or slightly pessimistic about it.

They are just building on what they did with the G80 and that may have been the greatest GPU launch of all time.
 
They are just building on what they did with the G80 and that may have been the greatest GPU launch of all time.
Oh, don't get me wrong, I've always been hopeful for Fermi, and I can't say I disagree with your assessment of G80. I'd rather have two GPU companies with strong offerings than a lopsided market, for the obvious price-war benefit of the former. I just wanted to point out from IDC's post that right now, with absolutely no gaming data and only GPGPU performance quotes, even 50% more transistors than Cypress is no guarantee of kick-ass performance, simply because Fermi isn't designed with the same goal as Cypress. Cypress was designed to be a gaming card, and it used up pretty much all of its transistor budget for that. Fermi, on the other hand, was designed to be a gaming card AND a monster GPGPU card, and I have to think a good part of its transistor budget went toward making it a much improved GPGPU card over the previous gen instead of being focused purely on gaming. So, if anything, it probably won't be just the number of transistors that makes the difference vs Cypress.

That's not to say it will fail. I don't even want to suggest that and clearly wasn't my intention when I made my last post.
 
Well they've got 50% more xtors invested in the sucker versus Cypress...gonna be sad if it is just so-so for gaming.

What I mean by 'so-so' is that it won't be the giant 5870 killer that many people expect. That's just my guess, the vibe I get, you could say. I wouldn't be surprised if it's faster than the 5870, but I just don't know that it'll really slaughter it in benchmarks.

Take a look at a GTX280, it has 1.4 billion transistors. http://www.anandtech.com/video/showdoc.aspx?i=3334

A Radeon 4870 has 956 million transistors. http://www.testfreaks.com/blog/review/diamond-ati-radeon-hd-4870-1024mb-gddr5-video-card-4870pe51g/

Sure, the GTX280 is generally the faster card, but not by a huge margin. And in some games and at some settings the 4870 even beats the GTX280... yet the GTX280 has ~46% more transistors.

I assume more transistors mean more power, which means more heat, and can end up affecting clock speed...? I'm sure you are much more knowledgeable on this subject than I am. 🙂

So anyway, that's kind of the vibe I get with Fermi, that a lot of what's being done with it is aimed more at HPC/GPGPU work than gaming.
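For what it's worth, the ~46% figure above does check out. A quick back-of-the-envelope check in Python, using only the transistor counts already quoted in this post:

```python
# Transistor counts as quoted above (GTX280 from the AnandTech link,
# HD 4870 from the TestFreaks review)
gtx280 = 1_400_000_000  # 1.4 billion
hd4870 = 956_000_000    # 956 million

# Relative difference: how many more transistors the GTX280 carries
extra = (gtx280 - hd4870) / hd4870
print(f"GTX 280 has {extra:.0%} more transistors than the HD 4870")
# → GTX 280 has 46% more transistors than the HD 4870
```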
 
...by posting pretty much the same news Charlie posted weeks ago:


Full post: http://www.fudzilla.com/content/view/16685/1/


You made me register because you, sir, are an idiot.

Let's take a look back:
http://www.semiaccurate.com/2009/08/18/nvidia-takes-huge-risk/

Charlie was posting his usual FUD, making a wild guess back in August.
Before any respins (which are the cause of Fermi's delay).

That means he made a wild guess...and got lucky.

But to say he knew (back then) only goes to show what an idiot you are.
And that your posts should be read even more carefully than the FUD Charlie writes...due to the low IQ presented in those posts.

You (and Charlie) are FUDsters...and not good for the industry.

Goodbye, idiot.
 
You made me register because you, sir, are an idiot.
Let's take a look back: http://www.semiaccurate.com/2009/08/...kes-huge-risk/
Charlie was posting his usual FUD, making a wild guess back in August. Before any respins (which are the cause of Fermi's delay).
That means he made a wild guess...and got lucky.
But to say he knew (back then) only goes to show what an idiot you are. And that your posts should be read even more carefully than the FUD Charlie writes...due to the low IQ presented in those posts.
You (and Charlie) are FUDsters...and not good for the industry.
Goodbye, idiot.
Wow, Charlie and his ilk (I assume Fuad and Theo?) cause a lot of hate to be flung around.

Sure, the GTX280 is generally the faster card, but not by a huge margin. And in some games and at some settings the 4870 even beats the GTX280... yet the GTX280 has ~46% more transistors. I assume more transistors mean more power, which means more heat, and can end up affecting clock speed...? I'm sure you are much more knowledgeable on this subject than I am. So anyway, that's kind of the vibe I get with Fermi, that a lot of what's being done with it is aimed more at HPC/GPGPU work than gaming.
We seem to be on the same page, exactly my sentiments. I hope we are both wrong, though.
 
You made me register because you, sir, are an idiot.

Let's take a look back:
http://www.semiaccurate.com/2009/08/18/nvidia-takes-huge-risk/

Charlie was posting his usual FUD, making a wild guess back in August.
Before any respins (which are the cause of Fermi's delay).

That means he made a wild guess...and got lucky.

But to say he knew (back then) only goes to show what an idiot you are.
And that your posts should be read even more carefully than the FUD Charlie writes...due to the low IQ presented in those posts.

You (and Charlie) are FUDsters...and not good for the industry.

Goodbye, idiot.

I know there is a lot of Charlie hate, and probably rightfully so given his obviously anti-Nvidia articles, but in the article you posted he does seem to have at least some reasonably thought-out ideas to back up what he says. I don't know that you could call it a 'guess' when he backs up his reasoning. Like him or not, he has been correct about Fermi so far, it seems... we'll just have to wait and see how it plays out, what Nvidia launches with and when. No need to call someone an idiot because you don't agree with them about what is essentially a piece of vaporware at this point.
 
Agree, not impressive at all, merely saying that with him being so hateful of nVidia (how did that happen anyway, did Jen-Hsun Huang kill his dog?), one would expect him to be wrong almost all the time, just as you'd expect fanatics in any topic wouldn't be worth listening to. But Charlie, much as he hates nVidia, seems to get a couple of things right. He does lay on the hate pretty thick, and it is very distracting.

yes, but it is much better than his 10% success rate from several years ago. I think it's pretty obvious that he actually has a legit source or three, but sometimes he either has no info or his sources speculate just like the rest of us.
 
+1. For a product that is delayed as much as it already has been, almost all will be forgiven as long as it comes out really great.

Being right about a single subject (the Fermi delay) doesn't excuse being wrong about other considerable subjects within the same time frame, as well as historically. Charlie is like Fox News: he feeds people what they want to hear, not what is actually going on.

yes, because nbc, abc, and cbs are so clearly unbiased in their reporting.

Quote (Originally Posted by T2k):

Quote (Originally Posted by Keysplayr): Provided above were three (3) recent examples of Charlie's stumblings

Ehhh? There wasn't any example, sorry.

As I said, Charlie was right all along the way, ever since he first mentioned the possibility of a protracted process, well into 2010.

Quote: And we won't know if the latest Fud article is accurate until we actually get a release date on Fermi. Reserve your judgement til then anyway.

Yeah, right: "Guys, no need to worry about Nvidia" - right, Focus Group Member?

dude, are you stupid? keys and zap are both capable of banning you, and I will be amazed if you don't get at LEAST a week for your last 2 posts.
 