how nvidia got into this mess


DannyBoy

Diamond Member
Nov 27, 2002
8,820
2
81
www.danj.me
Originally posted by: Adul
Originally posted by: NFS4
Originally posted by: lostinohio
:brokenheart: I just bought an XFX GeForce FX 5200. Are you guys telling me I just got scre--- without so much as a pat on the head or at least a beer?

The FX 5200 was junk even before this scandal.

now it is only good for...

mm, it's good for nothing actually, except playing old games.

Like, Pong :D
 

odog

Diamond Member
Oct 9, 1999
4,059
0
0
Originally posted by: NFS4
Originally posted by: lostinohio
:brokenheart: I just bought an XFX GeForce FX 5200. Are you guys telling me I just got scre--- without so much as a pat on the head or at least a beer?

The FX 5200 was junk even before this scandal.

That's the bad thing: a GF4 MX with uselessly slow DX9 performance :(

 

Goose77

Senior member
Aug 25, 2000
446
0
0
Originally posted by: PorBleemo
Because of...

A) ATI's amazing jump out of nowhere-land, which turned them into a major force in the GPU market.
B) nVidia's cheating problems in trying to get a lead on (or level with) ATI.
C) nVidia's cheating being discovered, which put ATI on higher ground and pushed nVidia even lower.
D) People love ATI now, such as me. :)

-Por


A lot of people seem to have either forgotten or didn't know, but ATI has been around for a while and has been pretty strong in the industry. They just had not really focused on the gaming market until now. They are huge in OEM and really popular with the Mac peeps. They didn't rush the release of their products because they didn't have as much riding on it! Remember what helped drive 3dfx out: the Nvidia six-month chip release cycle. Well, that set a standard which Nvidia couldn't keep! It seems as if they were afraid that if they fell behind it would cost them too much market share, something they didn't seem willing to give up! I have a feeling that it might have been a combination of things that caused this situation. Lucky for them that they have processor chipsets to fall back on, as they are prospering very well in that market!
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
nVidia focused on the IEEE shader specs and what Carmack wanted for Doom3. MS adopted ATi's DX9 spec submission, as did the ARB. That, more than anything, is how nVidia got where they are right now. I think a lot of people underestimate how long it takes to design a chip with over 100 million transistors; there is no way they could completely reengineer their part around specs they weren't expecting in a few months.

Think what would have happened if the DX9 spec had come out requiring FP32; people would be talking a lot differently about the current situation. There is actually some good reasoning outside of the IEEE spec for using FP32 over FP24: when we see DX10 it will require FP32 for pixel shaders, since FP32 is required to run vertex shaders and the two will share common resources under DX10. nVidia's real mistake with the NV3x line was the number of registers they chose to make available to the pixel shaders. The rest of what we are seeing is driver optimizations and/or bugs. If the shaders are fixed in the upcoming Detonators versus the betas and performance isn't down, then we will have some evidence as to which path they were going for.
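To put FP16/FP24/FP32 in perspective, here is a quick toy snippet (plain C on the CPU, just my illustration of mantissa width, nothing like how the actual hardware works) that rounds a value to the stored precision of each format:

#include <math.h>
#include <stdio.h>

/* Toy illustration: keep only 'bits' explicit mantissa bits of x,
 * mimicking the storage precision of FP16 (10 bits, s10e5),
 * FP24 (16 bits, s16e7, the DX9 minimum the R300 was built around)
 * and FP32 (23 bits, s23e8, IEEE single precision).
 * Exponent range and denormals are ignored entirely. */
static double quantize(double x, int bits)
{
    int e;
    double m = frexp(x, &e);             /* x = m * 2^e, 0.5 <= |m| < 1 */
    double scale = ldexp(1.0, bits + 1); /* +1 for the implicit leading bit */
    return ldexp(nearbyint(m * scale) / scale, e);
}

int main(void)
{
    double v = 1.0 / 3.0;
    printf("FP16-ish: %.9f\n", quantize(v, 10));
    printf("FP24-ish: %.9f\n", quantize(v, 16));
    printf("FP32-ish: %.9f\n", quantize(v, 23));
    return 0;
}

The rounding error compounds with every dependent shader instruction, which is why long shaders can show visible banding at FP16 while looking fine at FP24 or FP32.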

Another big advantage ATi had was being the lead dev platform for an extended period of time; they managed to get the R300 core out long before nVidia was able to ship their DX9 part, and as such games ended up being optimized for their parts. The current situation is nothing at all like the 3dfx days. One big difference is that in terms of market share, nVidia is still the leader in add-in graphics by a very sizeable margin. 3dfx never had any meaningful market share in the OEM space; ATi and nVidia have been the big two for add-in boards in that arena for about five years now, even if the enthusiast market didn't show that at all times. Another difference between 3dfx and nVidia, and a very important one: nVidia is still quite profitable. 3dfx was bleeding money for years, while nVidia has a bankroll that is the largest in their market.

If you look at the situation from a market angle, nVidia isn't doing badly at all. Back in the Voodoo2 SLI days ATi was the dominant player in this market without any noteworthy enthusiast market support; nVidia seems to be doing the same now. nV also has the advantage of not having to face a next-gen ATi part for almost another year, as ATi has stated that they are moving to a 24-month lifespan for generational leaps.

Overall, the gloom and doom about nVidia's future comes down to the fact that it fares poorly in two benchmarks (TR:AOD, HL2), one of which is a major title. That brings up the last issue with nVidia ATM. What if Valve had used the Doom3 rendering engine? That single point would move nVidia from the company with the horrible performance in upcoming games to the company that runs the upcoming games best, in the eyes of the majority of the enthusiast market.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Everybody seems to forget it was do-or-die for ATI.

ATI FOCUSED while nVidia felt it could afford to COAST. Nvidia was arrogant and p-o'd a lot of the industry with their MARKETing-DRIVEN (not engineering-driven) company.

nVidia's miscalculation with "the IEEE shader specs and what Carmack wanted for Doom3" was a result of their arrogance as a marketing-driven company . . . and it cost them prestige and the technology leadership in GPUs.

No, it won't kill nVidia . . . but it was "do" for ATI and they have nicely taken advantage of almost every nVidia miscalculation . . . ATI played "smart" and has gained market share at nVidia's expense.

And YES, I did predict this 2 years ago. :p
(I've admired ATI ever since they attempted to get back in the game with the Rage GPU)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
ATI FOCUSED while nVidia felt it could afford to COAST. Nvidia was arrogant and p-o'd a lot of the industry with their MARKETing-DRIVEN (not engineering-driven) company.

How did you come to this conclusion? If ATi isn't driven by the market, they will be dead inside of three years. You engineer your part around what the market wants. That doesn't mean what you or I want; it matters what the majority wants. If they feel otherwise, they will be another IP purchase in a few years. I in no way think ATi is close to that ignorant; they have been around for too long.

For engineering goals, ATi stuck with the same build process that had been in use for some time, they went with a memory configuration that had been tested and proven to work, and they used a straightforward shader implementation that was simpler to get working. If anything, nVidia would have been a lot better off if they had coasted as much as ATi did. If they had gone with .15u, a straightforward shader implementation, and a 256-bit bus in the first place for the NV30, no one would be talking about nVidia being in such a rough spot in the enthusiast community right now.

nVidia's miscalculation with "the IEEE shader specs and what Carmack wanted for Doom3" was a result of their arrogance as a marketing-driven company . . . and it cost them prestige and the technology leadership in GPUs.

Really? Look at the history of the market, even if you narrow it down to the enthusiast market. Where Carmack goes....

This time may end up different, and we have seen a few games that show there will be exceptions, but how is following someone else's lead a sign of arrogance? For that matter, if ATi ends up right in the end, with more games within the lifespan of the current-gen chips falling closer to HL2 than Doom3, then are they the arrogant market-driven company? Is whoever provides what the market wants the market-driven company? Or are you stating that because nV has a far more effective marketing department (remember, there are 100 idiots who will believe anything for every informed buyer ;) ) no good engineers work there? You do realize that they are different departments, right? ;)

No, it won't kill nVidia . . . but it was "do" for ATI and they have nicely taken advantage of almost every nVidia miscalculation . . . ATI played "smart" and has gained market share at nVidia's expense.

nVidia has gained market share also; it isn't like this is a two-horse race (unless you ignore ATi and focus on the market leaders in graphics, Intel and nV). ATi didn't play smart enough to keep nV from surpassing them in DX9 installed base within a couple of months. If ATi had pulled the R3xx core off perfectly they would have had an R9300 or comparable part, a lower-clocked four-pipe DX9 card to compete in the sub-$100 range. If they had done that, nVidia might be facing actual market share loss instead of a reduced rate of growth. This would have been particularly nasty for nV if ATi had pulled it off back at the launch of the R300; they would have been able to seriously hurt nV in the add-in board market, enough to impact their bottom line and bolster ATi's even more.

The FX5200 by this point should have surpassed the sales of the entire R3xx line of boards combined (handily, actually). That is the board that is pretty much slammed across the board as a complete POS by nearly every review site and enthusiast; how did that happen? If ATi were really as on the ball as some people make them out to be, they could have bolstered their position significantly in terms of market share and profitability. As it stands now, ATi, the 'engineering' based company, is saying they intend to wait 24 months between each new architecture due to the costs of engineering the parts. Maybe if they had shipped a DX9 budget-class part in a timely fashion we would be looking at their next-gen core in a couple of months instead of next summer. Unfortunately, ATi didn't engineer one of those in time ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Sorry Ben, can't agree with you even though you are 'dead on' on the technology details.
(and I am ONLY focusing on the GPU market)

Nvidia's arrogance blinded them to the realities of the graphics situation a year or so ago. They tried to make THEIR graphics solutions proprietary. They tried to "take on" MS. They failed.

It's pretty simple really. nVidia's "Marketing-driven" hype . . . they believed their own BS.

ATI focused on the realities of the graphics market 2 years ago and let their engineers design the best GPU possible, while nVidia fought everyone (3DMark; tech sites) and blamed everyone else (the manufacturer of the .13 process) with extreme arrogance.

As to the 2-year cycle, it's about time. phew!
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Nvidia's arrogance blinded them to the realities of the graphics situation a year or so ago. They tried to make THEIR graphics solutions proprietary. They tried to "take on" MS. They failed.

The major design choices for the NV3X were finalized long before MS finalized the DX9 specs. nVidia went with the existing IEEE standard, and they also offered support for the mode that Carmack had been requesting. I don't see the fault in their choice in terms of which standards to work with. I can see the fault in some of the design choices in implementing the technology, although from their perspective FP32 was almost certainly included for the Quadro line and simply left enabled for the FX. They weren't trying to take on MS or any other such thing; they had a design, and after they already had their design MS decided to go a different route. The same thing happened with the ARB (which surprised a lot of people). MS decided to go with a completely new standard that ignored what had already been done. If you want to talk about arrogance, you could certainly make the argument that ATi was the arrogant one. They are the ones that decided to go with a precision standard that no one had used before, and had MS decided to go with FP32 as the baseline, ATi would be in a lot worse shape than nV is now. I can understand stating that in retrospect it wasn't the best move nVidia could have made, but I don't see why you consider it arrogant. Perhaps you can explain it?

ATI focused on the realities of the graphics market 2 years ago and let their engineers design the best GPU possible, while nVidia fought everyone (3DMark; tech sites) and blamed everyone else (the manufacturer of the .13 process) with extreme arrogance.

I make a very clear distinction within companies: marketing and engineering are two completely different sides. It's engineering's job to make the best product for the market they can; it's marketing's job to make whatever product they have look the best it can. nVidia's engineering department is still extremely good at what they do. They made it clear that they were looking at the Doom3 engine as a target of sorts for the NV3X hardware, and it seems as if they hit that target extremely well. The failure of TSMC to bring out .13u on time was a gamble by nVidia, and actually less of one than what ATi did with their precision support for the R300. Obviously nVidia's gamble fell flat and they ended up late. ATi's gamble, if it had gone the wrong way, would have left them with a complete dead-end part (if DX9 had been FP32).

Let me ask you this, is AMD being arrogant releasing x86-64 CPUs, or is Intel being arrogant by not? What if AMD did, and then MS refused to support it? What do you think would happen to AMD? Is Intel being arrogant with IA-64?

One of the companies is trying to create a major wave in the industry by introducing a radical new architecture, while the other is expanding on a tested platform and making some big improvements. Which company would you consider arrogant? I can debate either side of the CPU battle, as I can with the GPU. I can see the reasoning behind ATi's, nVidia's, AMD's and Intel's stances with their respective products, and I don't think any of them are displaying arrogance in terms of their engineering departments (as anyone who has really looked at any of the products can tell you, all of them are very well engineered if they are operating under the conditions they were designed for).
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Overall, the gloom and doom about nVidia's future comes down to the fact that it fares poorly in two benchmarks (TR:AOD, HL2), one of which is a major title. That brings up the last issue with nVidia ATM. What if Valve had used the Doom3 rendering engine?

That's funny you say that, because I can quote you saying Doom 3 wouldn't be sh*t without pixel shading.
 

arcenite

Lifer
Dec 9, 2001
10,660
7
81
Originally posted by: Regs
Overall, the gloom and doom about nVidia's future comes down to the fact that it fares poorly in two benchmarks (TR:AOD, HL2), one of which is a major title. That brings up the last issue with nVidia ATM. What if Valve had used the Doom3 rendering engine?

Well.. besides everything else... HL2 would probably be 2x as expensive and 2x as ugly ^^

Bill
 

JackHawksmoor

Senior member
Dec 10, 2000
431
0
0
I thought Doom 3 runs better on ATi's hardware too? I thought the Nvidia 5x00 chips have the same types of problems running it as they do running DirectX 9 stuff?
 

sandorski

No Lifer
Oct 10, 1999
70,696
6,257
126
Originally posted by: JackHawksmoor
I thought Doom 3 runs better on ATi's hardware too? I thought the Nvidia 5x00 chips have the same types of problems running it as they do running DirectX 9 stuff?

I'm not sure if it runs better, but as with HL2, Carmack had to spend a lot of extra time getting the GF FX chips to work well/right in D3. Carmack had to make a hybrid solution, again like HL2 did, whereas the ATI cards were much easier and more standard in their implementation.
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Originally posted by: oldfart
Originally posted by: Dead Parrot Sketch
I'm still hoping Rendition comes back!
I miss them too!

Those things kicked butt; I got 2 Rendition Verite 2100s. Those were great back in the day. Didn't Micron buy them?
 

Kazi

Senior member
Jun 7, 2001
637
0
0
I couldn't care less about the brand; if ATI outperforms Nvidia (which they do at the moment) I will go with ATI... if Nvidia outperforms ATI, I'll go with them.

Hopefully by the time I get enough money to buy my vid card, one or the other will have BETTER cards out (PCIX?) and it won't matter what card you have, you will have over 900 FPS in Quake3.... lmao
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BenSkywalker
*snip*
I am speaking of nVidia's arrogance in general . . . . Reread the news archives if your selective memory is failing. ;)


 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
That's funny you say that, because I can quote you saying Doom 3 wouldn't be sh*t without pixel shading.

No, you can't. I said that it won't be sh!t without shaders, not pixel shaders in particular :). Also, I was talking about the visuals of the game (can't wait to see what Raven does with the engine).

I'm not sure if it runs better, but like HL2, Carmack had to spend a lot of extra time getting the GF FX chips to work well/right on D3. Carmack had to make a hybrid solution, again like HL2 did, whereas the ATI cards were much easier and standard in the implementation.

What 'standard' was Carmack using for the first nearly three years he was working on the D3 engine? The 'ARB2' path wasn't using an OpenGL standard; the spec wasn't finalized. Carmack has gone on record numerous times stating that the GeForceFX is pretty much exactly what he was looking for to run Doom3, and he has done so by stating what the D3 engine would need. You can say that was money talking, but he started saying it back in 1999 (look up his old comments from that era about what his next engine would need; it sounds like an FX). If ATi offered the ability to utilize the features that nV can, he would have coded a special path for them also (he already stated as much, and did so for the R200).
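The general technique is simple enough to sketch. Below is my own C mock-up of per-vendor render path selection, loosely in the spirit of the back ends Carmack has described in his .plan updates; the extension strings are real OpenGL extensions, but the selection logic is my guess, not id's actual code:

#include <stdio.h>
#include <string.h>

/* Hypothetical render path selection in the spirit of Doom3's multiple
 * back ends (ARB, R200, NV30, ARB2). The extension names are real; the
 * policy below is an illustration only. */
typedef enum { PATH_ARB, PATH_R200, PATH_NV30, PATH_ARB2 } renderPath_t;

static renderPath_t pickRenderPath(const char *ext)
{
    if (strstr(ext, "GL_ARB_fragment_program")) {
        /* A GeForce FX exposes the ARB2 requirements but runs faster
         * on its lower-precision vendor path, so prefer that one. */
        if (strstr(ext, "GL_NV_fragment_program"))
            return PATH_NV30;
        return PATH_ARB2;           /* e.g. R300: full-precision path */
    }
    if (strstr(ext, "GL_ATI_fragment_shader"))
        return PATH_R200;           /* Radeon 8500 class hardware */
    return PATH_ARB;                /* lowest common denominator  */
}

int main(void)
{
    printf("GeForce FX -> %d\n",
           pickRenderPath("GL_ARB_fragment_program GL_NV_fragment_program"));
    printf("Radeon 9700 -> %d\n",
           pickRenderPath("GL_ARB_fragment_program"));
    return 0;
}

The point being that a vendor path isn't 'non-standard' so much as an extra case in a switch; the real cost is writing and tuning the extra shader back end, which is the work Carmack put in for the FX.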

I am speaking of nVidia's arrogance in general . . . . Reread the news archives if your selective memory is failing.

You have been talking about their engineering, and that is what I am asking about.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BenSkywalker
You have been talking about their engineering, and that is what I am asking about.
I have been talking about the entire company - engineering and marketing.

They have been quite arrogant (just more defensive, now).


When I say "marketing-driven", I mean the direction given to the engineers is determined by the marketing department.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I have been talking about the entire company - engineering and marketing.

They have been quite arrogant (just more defensive, now).

I'm asking what the engineering department has done that is arrogant.

When I say "marketing-driven", I mean the direction given to the engineers is determined by the marketing department.

I think you'll find that when a chip enters the design phase, the features they are implementing are something the marketing department has no clue about :)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BenSkywalker
I have been talking about the entire company - engineering and marketing.

They have been quite arrogant (just more defensive, now).

I'm asking what the engineering department has done that is arrogant.

When I say "marketing-driven", I mean the direction given to the engineers is determined by the marketing department.

I think you'll find that when a chip enters the design phase, the features they are implementing are something the marketing department has no clue about :)
I didn't single out nVidia's engineering department as arrogant . . .

BEFORE the GPU enters the design stage - at nVidia at least - the engineers are given a 'direction' (specific 'targets') by their "marketing-driven company".

Engineers are told to design "features" that the company believes will sell. . . . It is my OPINION that ATI engineers are given more "freedom" . . . and my opinion is based on what I have read over the last few years.

It isn't important for me to nitpick issues with you. I don't feel like looking up my (now archived) sources.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I didn't single out nVidia's engineering department as arrogant . . .

I didn't state that their marketing department wasn't arrogant.

BEFORE the GPU enters the design stage - at nVidia at least - the engineers are given a 'direction' (specific 'targets') by their "marketing-driven company".

Of course those marketing guys are geniuses when it comes to determining whether to use a BPU or execute all conditional branches in tandem, in terms of which approach will offer the biggest ROI....
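(To spell that example out: a BPU is a branch prediction unit, and executing branches 'in tandem' is predication, where the hardware computes both sides of an if and selects one result. A toy C sketch of the difference, my own illustration rather than anything from a real GPU design:)

#include <stdio.h>

/* Branching: only one side executes. */
static float withBranch(float x)
{
    return (x > 0.0f) ? x * 2.0f : x + 1.0f;
}

/* Predication: both sides always execute, then a mask selects one.
 * (Roughly how shader hardware of this era handled 'if' without any
 * branch prediction; the ?: below stands in for a hardware compare.) */
static float withPredication(float x)
{
    float mask = (x > 0.0f) ? 1.0f : 0.0f;
    float a = x * 2.0f;
    float b = x + 1.0f;
    return mask * a + (1.0f - mask) * b;
}

int main(void)
{
    float xs[] = { -1.0f, 0.0f, 1.0f };
    for (int i = 0; i < 3; i++)
        printf("%+.1f: branch=%+.1f predicated=%+.1f\n",
               xs[i], withBranch(xs[i]), withPredication(xs[i]));
    return 0;
}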

Engineers are told to design "features" that the company believes will sell. . . . It is my OPINION that ATI engineers are given more "freedom" . . . and my opinion is based on what I have read over the last few years.

Following existing standards is such a dastardly approach too....

It isn't important for me to nitpick issues with you. I don't feel like looking up my (now archived) sources.

Nitpick? I've been asking for a single example of why you think nVidia is more "marketing driven" than ATi on an engineering basis. If I wanted baseless conjecture I could always look to the Inquirer, but I want one example of this. Tell me about a feature that was clearly marketing-driven on the engineering front.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I have to ask, and I'm being quite serious: are you heavily medicated at the moment? It is the only logical conclusion I can come up with in response to the question asked versus your reply. I ask how the engineering department is driven by their marketing department and you answer with links about review site issues and financial reports...? What their marketing department does when they are doing their job in no way has anything to do with them telling the engineering department what to do, nor does it show any hint of arrogance in the engineering department.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
I think their marketing department did a great job trying to sell the FX series. I mean, look at it: it's wrapped around a huge-ass fan with heat sinks bottom to top. Yeah.. you, Ben, are spending too much time justifying Nvidia and not their FX video card series.

There is nothing wrong with Nvidia; they are not "evil". But people seem to be a little ignorant to even try to defend the FX series compared to a 9700 Pro-class card just because it was made by a good company. Why can't people accept that the 9700 Pro architecture is just better? Is it only Nvidia that could make the technology breakthroughs?
It seems like it, because every time someone states the 9700 Pro is better, people reply back with "Wait until the new Nvidia core comes out", in a nutshell.

Marketing. Biased views, blind to the facts. The FX series is a perfect by-product of being on the bleeding edge.
 

Tom

Lifer
Oct 9, 1999
13,293
1
76
I would say it was more the accounting dept than the marketing dept. Nvidia had a performance edge over ATI and felt they could afford to spend some time moving to the smaller process size, which should reduce costs, I think, and ultimately allow for better future products.

At the same time ATI decided to go all out to get the performance crown, they had tried to do this with the 8500 but Nvidia had countered that. ATI was very determined not to lose the next round.

Put those two things together and ATI ended up with a good-sized lead. I think Nvidia was caught so off guard that they felt they couldn't afford to take the time to completely redesign, so they are trying to stretch their now out-of-date platform as far as they can, while probably simultaneously designing a fuller response. So far they haven't done too well.

 

DannyBoy

Diamond Member
Nov 27, 2002
8,820
2
81
www.danj.me
Originally posted by: Regs
I think their marketing department did a great job trying to sell the FX series. I mean, look at it: it's wrapped around a huge-ass fan with heat sinks bottom to top. Yeah.. you, Ben, are spending too much time justifying Nvidia and not their FX video card series.

There is nothing wrong with Nvidia; they are not "evil". But people seem to be a little ignorant to even try to defend the FX series compared to a 9700 Pro-class card just because it was made by a good company. Why can't people accept that the 9700 Pro architecture is just better? Is it only Nvidia that could make the technology breakthroughs?
It seems like it, because every time someone states the 9700 Pro is better, people reply back with "Wait until the new Nvidia core comes out", in a nutshell.

Marketing. Biased views, blind to the facts. The FX series is a perfect by-product of being on the bleeding edge.

I must admit, the number of times I've heard someone say "Wait until the new nvidia cards come out" is incredible.

At the rate things have been going, if you were to do that you would be waiting a while before (if ever) a card from nvidia was on top again...