[SemiAccurate] Nvidia's Fermi GTX480 is broken and unfixable

Page 6 - AnandTech Forums

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I know this was already mentioned, but Fudzilla is reporting a different outlook on Fermi than Charlie is.

GTX 480 to be as hot as GTX 285: http://www.fudzilla.com/content/view/17723/1/

and nvidia says they're happy with TSMC's yields: http://www.fudzilla.com/content/view/17731/1/

So as hot as the GTX 285 but a higher TDP? Remember, the GTX 280 was hotter and had a higher TDP than the GTX 285, so in my opinion it sounds like Fermi is on a better foundation than the GT200 was at its initial release.
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
This thing reads like the ATI PR department wrote it because they probably did. Chuckles should promise to go away if he's lying. I'm betting he wouldn't make that promise since his job is to shill for his masters.
 

Schmide

Diamond Member
Mar 7, 2002
5,791
1,097
126
Quote:
Originally Posted by DefRef View Post
This thing reads like the ATI PR department wrote it because they probably did. Chuckles should promise to go away if he's lying. I'm betting he wouldn't make that promise since his job is to shill for his masters.

Lame. Give ATI's PR department a bit more credit. They're way more conservative than Charlie. They may put forth a jab here and there along with some favorable analysis, but they know when to give credit and when to bite their tongue.

All these conspiracy theories are getting a bit long in the tooth. FFS, people, take the article for what it is: a tabloid of conjecture on what's going on when missteps are made. Talk about what it says and forget about the messenger.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
The one thing I've learned from watching ATI/Nvidia over the years is that when either company has a genuinely great product in development that is running along smoothly they take every opportunity to show it off. They don't keep it hidden from the press.

The fact that Nvidia has been so hush hush about this product likely means they are having major issues.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
all that he has is old info stating that yields suck, but he extrapolated an entire article out of it. are his arguments compelling? yes, and they could very well be true, but I for one think that rumors of jhh's imminent demise are greatly exaggerated.

nvidia has a good track record of producing the goods under pressure. is every day with no new info a bad omen? sure it is, but keep in mind that nvidia still has a LOT more money to spend on graphics r & d than amd. even if fermi absolutely blows nvidia will probably still end up around 50% of the mid/high end for 2010, and they have enough fans that many will wait for 6xxx and nvidia's next gen before making a decision. If nvidia pulls an amd cpu division-type fumble and cranks out 3 or 4 gens that suck then I'll start getting worried.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Quote:
Originally Posted by bryanW1995 View Post
nvidia has a good track record of producing the goods under pressure. is every day with no new info a bad omen? sure it is, but keep in mind that nvidia still has a LOT more money to spend on graphics r & d than amd. even if fermi absolutely blows nvidia will probably still end up around 50% of the mid/high end for 2010, and they have enough fans that many will wait for 6xxx and nvidia's next gen before making a decision. If nvidia pulls an amd cpu division-type fumble and cranks out 3 or 4 gens that suck then I'll start getting worried.

I have read Nvidia makes quite a good profit from the HPC market.

How much presence does ATI have in High performance computing?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Quote:
Originally Posted by CurseTheSky View Post
G100 is already late. Rumors are the chip is going to be large, hot, power hungry, and absolutely amazing performance wise.
You mean like the GTX 280? IMO that turned out okay.

yes, but it launched BEFORE 4xxx, not 7+ months later
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Quote:
Originally Posted by OneOfTheseDays View Post
The one thing I've learned from watching ATI/Nvidia over the years is that when either company has a genuinely great product in development that is running along smoothly they take every opportunity to show it off. They don't keep it hidden from the press.

The fact that Nvidia has been so hush hush about this product likely means they are having major issues.

Where were you at the G80 launch? :colbert:
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Quote:
You mean like the GTX 280? IMO that turned out okay.

Quote:
Well GTX 280 wasn't 55nm. It was 65nm AFAIK.

that's the problem, neither nvidia nor tsmc was ready to pump out such a huge chip on 40nm ...they should have launched at 55nm then later gone to 40nm.

In fact, that mirrors what happened with gt200: they came out with gt200b 6-7 mos later with slightly higher clocks. Of course, tsmc was not having the major problems at 55nm that they now have on 40nm, so this situation is probably worse. it looks like the amd cpu division's long-term focus on manufacturing excellence is starting to pay off in a big way.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
If we assume this article is true and NV puts out a very limited run of Fermi's that end up being extremely hot and power hungry, then what?

For one, NV won't fold and go away. Brother Jen has far too big an ego not to regroup and come back hard. You will probably see some internal process changes at NV to try to keep this from happening again, and a redoubling of effort on the next gen or respin. In a way, a disaster with Fermi could be a good thing in the long term.

ATI's 5000 series cards are good, and Eyefinity is pretty cool, but it is only an evolutionary step. The Fermi architecture, from what we know about it, is more of a revolutionary step, and ATI had better be taking notes. When (not if) the Fermi architecture and design philosophy actually hit the street in significant numbers, it has the potential to really be the leader.
 

gorobei

Diamond Member
Jan 7, 2007
4,116
1,622
136
fermi may or may not be revolutionary in architecture. the main thing they've done to distinguish themselves on dx11 is to push tessellation. but the way they've implemented it is to give each shader block its own tessellation module. this is a fine way to boost performance, but not if it sacrifices shader compute power that would normally go to something else. If it only impacts blocks that wouldn't be doing anything else at the time, then yes, it's a nice boost. If it occupies blocks that would be doing postproc/AA/AF/displacement, then it is parasitic.

we won't know until a true dx11-only game comes out. and that probably won't be for years.
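The tradeoff described above can be sketched with a toy occupancy model. This is purely illustrative with made-up numbers, not based on any real Fermi scheduling details: if tessellation work lands on shader blocks that would otherwise sit idle, shading throughput is unaffected; once it exceeds the idle capacity, it displaces shading work one-for-one and becomes parasitic.

```python
# Toy model of shader-block contention (hypothetical numbers, not real hardware).
# Each of `blocks` units can finish 1 unit of work per tick. Tessellation work
# is first absorbed by idle blocks ("free"); any excess steals from shading.

def shading_throughput(blocks, shading_work, tess_work):
    """Return units of shading work completed this tick."""
    active_shading = min(shading_work, blocks)
    idle = blocks - active_shading
    # Tessellation not covered by idle blocks displaces shading work.
    stolen = max(tess_work - idle, 0)
    return active_shading - min(stolen, active_shading)

# 16 blocks, 12 units of shading: the 4 idle blocks absorb 4 units of
# tessellation, so shading is unaffected.
print(shading_throughput(16, 12, 4))   # -> 12 (tessellation is "free")
# With 8 units of tessellation, 4 units of shading get displaced.
print(shading_throughput(16, 12, 8))   # -> 8 (tessellation is parasitic)
```

The crossover point is exactly the idle capacity, which is why the answer depends on whether a frame leaves shader blocks spare, something only a real DX11 workload would reveal.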
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Quote:
Originally Posted by lifeblood View Post
If we assume this article is true and NV puts out a very limited run of Fermi's that end up being extremely hot and power hungry, then what?

For one, NV won't fold and go away. Brother Jen has far too big an ego not to regroup and come back hard. You will probably see some internal process changes at NV to try to keep this from happening again, and a redoubling of effort on the next gen or respin. In a way, a disaster with Fermi could be a good thing in the long term.

ATI's 5000 series cards are good, and Eyefinity is pretty cool, but it is only an evolutionary step. The Fermi architecture, from what we know about it, is more of a revolutionary step, and ATI had better be taking notes. When (not if) the Fermi architecture and design philosophy actually hit the street in significant numbers, it has the potential to really be the leader.

To be honest, this Fermi 'launch' really reminds me of the R600 launch: a product with a whole lot of revolutionary features (such as the 512-bit ring bus, the tessellator, and others) on a new process. It was really late, had to be clocked down heavily, ended up far slower than its competition, consumed more power, and cost more to produce. The successor (HD 3870) cut back on some of those features but was far more competitive. The third generation of that architecture really shone brightly, though (HD 4870).
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Quote:
Originally Posted by lifeblood View Post
If we assume this article is true and NV puts out a very limited run of Fermi's that end up being extremely hot and power hungry, then what?

For one, NV won't fold and go away. Brother Jen has far too big an ego not to regroup and come back hard. You will probably see some internal process changes at NV to try to keep this from happening again, and a redoubling of effort on the next gen or respin. In a way, a disaster with Fermi could be a good thing in the long term.

ATI's 5000 series cards are good, and Eyefinity is pretty cool, but it is only an evolutionary step. The Fermi architecture, from what we know about it, is more of a revolutionary step, and ATI had better be taking notes. When (not if) the Fermi architecture and design philosophy actually hit the street in significant numbers, it has the potential to really be the leader.

If AMD's earlier GPUs had never had a tessellator, only supported DX10, used GDDR3, and AMD had never built a 40nm part before, then the 58xx parts might seem more revolutionary as well. But AMD did their legwork with earlier designs, so their 58xx part seems more evolutionary relative to their prior parts. That's just because AMD took those steps as they went along; Nvidia has to take a lot of these steps at once.
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
And, in all the time Fermi is being delayed, I doubt ATI is sitting complacent with their current offerings. Better believe they are working feverishly on their next generation GPU.

If NVidia does fook Fermi up, they will bounce back in no time. They already survived their FX disaster and even made a profit. That's the power of their marketing.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
I feel like compute is going to kill nvidia. If nvidia made cards strictly aimed at gaming, they might be able to get away with a competitive 55nm product, but going with such a heavy compute focus guarantees they're going to be at significantly lower performance density per mm².

Hope they have a g98 waiting in the wings (say, a g92 with doubled or tripled shaders) or they're not going to have much to sell.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Quote:
Originally Posted by DefRef View Post
This thing reads like the ATI PR department wrote it because they probably did. Chuckles should promise to go away if he's lying. I'm betting he wouldn't make that promise since his job is to shill for his masters.

What value is there in a promise from a liar?
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Quote:
Originally Posted by SlowSpyder View Post
If AMD's earlier GPUs had never had a tessellator, only supported DX10, used GDDR3, and AMD had never built a 40nm part before, then the 58xx parts might seem more revolutionary as well. But AMD did their legwork with earlier designs, so their 58xx part seems more evolutionary relative to their prior parts. That's just because AMD took those steps as they went along; Nvidia has to take a lot of these steps at once.

DX11, 40nm, etc. were the obvious next steps. I'm actually referring more to the GPGPU part of it. NV's push to make the GPU usable for more than video, and working with developers to use it, is a break from the past (although it should have happened a long time ago). Of course, given that AMD/Intel are going for integrated CPU/GPU, I guess they had no choice.

Still, my point is that even if Fermi is a dud, NV will come back and they will come back strong. NV and ATI both have smart, dedicated people.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Quote:
Originally Posted by OneOfTheseDays View Post
The one thing I've learned from watching ATI/Nvidia over the years is that when either company has a genuinely great product in development that is running along smoothly they take every opportunity to show it off. They don't keep it hidden from the press.

The fact that Nvidia has been so hush hush about this product likely means they are having major issues.

The same thing happened with the R600 and the NV30.

Quote:
Originally Posted by bryanW1995 View Post
that's the problem, neither nvidia nor tsmc was ready to pump out such a huge chip on 40nm ...they should have launched at 55nm then later gone to 40nm.

I agree with you. I think that even if the die reached 800mm² in size, it would be more feasible to make, cheaper, and would have better yields than it currently has on the buggy 40nm. They didn't do their job to counter the problems that TSMC's 40nm brought, like the variable transistors.

Quote:
Originally Posted by gorobei View Post
fermi may or may not be revolutionary in architecture. the main thing they've done to distinguish themselves on dx11 is to push tessellation. but the way they've implemented it is to give each shader block its own tessellation module. this is a fine way to boost performance, but not if it sacrifices shader compute power that would normally go to something else. If it only impacts blocks that wouldn't be doing anything else at the time, then yes, it's a nice boost. If it occupies blocks that would be doing postproc/AA/AF/displacement, then it is parasitic.

we won't know until a true dx11-only game comes out. and that probably won't be for years.

I agree with you. I heard that shader power would be sacrificed when tessellation was used, since the stream block processing it wouldn't be able to do anything else during the operation. I think they did it for the sake of saving die space.

Quote:
Originally Posted by SlowSpyder View Post
If AMD's earlier GPUs had never had a tessellator, only supported DX10, used GDDR3, and AMD had never built a 40nm part before, then the 58xx parts might seem more revolutionary as well. But AMD did their legwork with earlier designs, so their 58xx part seems more evolutionary relative to their prior parts. That's just because AMD took those steps as they went along; Nvidia has to take a lot of these steps at once.

Like the tessellator, DX10.1, the 40nm experiment (aka the HD 4770), GDDR5: everything in steps.

Quote:
They make good profit from the HPC market (e.g. the margins are MUCH better), but overall it's just a niche at the moment. I think I've read something like 2-4%.

It might be a measly 4%, but every card is sold for three times its manufacturing cost or more.

Quote:
Originally Posted by lifeblood View Post
DX11, 40nm, etc. were the obvious next steps. I'm actually referring more to the GPGPU part of it. NV's push to make the GPU usable for more than video, and working with developers to use it, is a break from the past (although it should have happened a long time ago). Of course, given that AMD/Intel are going for integrated CPU/GPU, I guess they had no choice.

Still, my point is that even if Fermi is a dud, NV will come back and they will come back strong. NV and ATI both have smart, dedicated people.

If ATi hadn't concentrated on the small-die strategy, it would probably have done better in GPGPU than it has now. But since they have a CPU division, they think in terms of the GPU working together with the CPU; some tasks are better done on the CPU and others on the GPU. nVidia lacks a CPU division, so they will try harder to diminish the importance of the CPU and work harder to boost GPGPU performance, especially given their rivalry with Intel.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Quote:
Originally Posted by lifeblood View Post
DX11, 40nm, etc. were the obvious next steps. I'm actually referring more to the GPGPU part of it. NV's push to make the GPU usable for more than video, and working with developers to use it, is a break from the past (although it should have happened a long time ago). Of course, given that AMD/Intel are going for integrated CPU/GPU, I guess they had no choice.

Still, my point is that even if Fermi is a dud, NV will come back and they will come back strong. NV and ATI both have smart, dedicated people.

Thing is, Apple and OpenCL will do more for compute than nvidia. Meaning that the baseline for compute-capable hardware is g80; ati probably doesn't even have any plans for something like fermi, but I guess intel's larrabee could go head to head with fermi 2 or 3. Still, that'll either be a CUDA versus C++ war, or OpenCL will advance to support higher levels of compute.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Charlie has done so well with the "Nvidia hates me" stuff that fudo is now trying "ATI hates me." Problem for fudo is that ATI isn't cooperating and continues to release new cards.

When nvidia finally gets fermi out the door in numbers, Charlie will have to find a new target.