AMD responds to "tough" questions from nVidia


OCGuy

Lifer
Jul 12, 2000
Originally posted by: thilan29
Originally posted by: DefRef
Since ATI has nothing to retaliate with for at least a year

That may not be necessarily true...they could do something similar to the 4890 and release an even more competitive card.

Um...4890 = OC'd 4870. You can pretty much predict what a 5890 would be by overclocking the core to 1GHz and overclocking the RAM as well.

5870X2 you can predict by the 5870 Xfire results.


There is pretty much nothing else for ATi to do other than sit back and hope nV drops a massive turd this gen.
 

dguy6789

Diamond Member
Dec 9, 2002
The anti ATI bias is so thick in here!

There's no guarantee the GT300 will even be faster than the 5870. That is speculation and assumption all by itself. Now if it is faster, what are the odds that it will be a good value? Rumors say the GT300 is a new architecture, the largest, most complex, most expensive GPU Nvidia will have ever produced. The chances of it being able to compete price wise with the 5870 are nil. If Nvidia launches the GT300 and it is 30-50% faster than the 5870 for $600, they are done like dinner because they have nothing to beat the 5870x2 that they will be competing against in that price segment.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: OCguy
Um...4890 = OC'd 4870. You can pretty much predict what a 5890 would be by overclocking the core to 1GHz and overclocking the RAM as well.

And it ended up competing with the GTX275 and sometimes even with the GTX285, so it stands to reason that IF the 5870 only competes with the GTX360, they could release a 5890 to compete with the GTX380 (or whatever their new cards are called) pretty easily, could they not?
 

JSt0rm

Lifer
Sep 5, 2000
Originally posted by: DefRef
Yawn... Gee, is it "ATI fanboys squealing like a Miley Cyrus audience over how their beloved Red company will trounce the Green team" time again?

Listen, kids, ATI put out a new card first and nVidia will have one coming later. There are two ways this can be spun:

1. The ATI fanboy way - "HA-HA! nVidia is crapping themselves because the new ATI 59000000000000000000000HDROFLCOPTERZOMGBBQ is teh fastester and since they aren't rushing to announce their GTX300 cards, it can only mean one thing - they got NUTHIN!!! Bwahahahahahaha!!!"

B. The sane person way - "ATI has unveiled their new card and it's quite impressive. However, now nVidia knows exactly where the price/power bar is set and could respond with something that smokes ATI's offering and offer gamers a better value. Since ATI has nothing to retaliate with for at least a year, those early sales will be pretty much it."

IF nVidia matches and/or exceeds what ATI has - pretty likely since, other than the 9700 vs. 5800 days, nVidia's had ATI's number for a decade - then I wonder how many people whooping it up now will become whiners that they got "stuck" with a slow card? Heh.

what about points 2. and A?
 

nemesismk2

Diamond Member
Sep 29, 2001
www.ultimatehardware.net
Originally posted by: thilan29
Originally posted by: OCguy
Exactly. Is AMD actually pushing the release of DX11 games to try and get them out before G300 hits, so people think "Oh, this is why I need to buy 5XXX right now! Dirt 2!"? If so, it is a valid question.

No I think the release of Dirt2 on PC has been delayed (not pulled forward) to include DX11 (it's already out for consoles)...but it's the same thing that happened with Arkham Asylum and PhysX so nV's statement is a bit ridiculous in that sense.

I agree with you totally, if you ask me it appears that Nvidia is worried about the 5870 lol ;)
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: thilan29
Originally posted by: OCguy
Exactly. Is AMD actually pushing the release of DX11 games to try and get them out before G300 hits, so people think "Oh, this is why I need to buy 5XXX right now! Dirt 2!"? If so, it is a valid question.

No I think the release of Dirt2 on PC has been delayed (not pulled forward) to include DX11 (it's already out for consoles)...but it's the same thing that happened with Arkham Asylum and PhysX so nV's statement is a bit ridiculous in that sense.

One minor difference. There were already PhysX titles out. If ATI is smart, they will use Dirt2 with DX11 add ons as a selling point for their DX11 hardware and try to capitalize on it in the short time they have being the only DX11 game in town.

 

zebrax2

Senior member
Nov 18, 2007
Originally posted by: Keysplayr
Originally posted by: thilan29
Originally posted by: OCguy
Exactly. Is AMD actually pushing the release of DX11 games to try and get them out before G300 hits, so people think "Oh, this is why I need to buy 5XXX right now! Dirt 2!"? If so, it is a valid question.

No I think the release of Dirt2 on PC has been delayed (not pulled forward) to include DX11 (it's already out for consoles)...but it's the same thing that happened with Arkham Asylum and PhysX so nV's statement is a bit ridiculous in that sense.

One minor difference. There were already PhysX titles out. If ATI is smart, they will use Dirt2 with DX11 add ons as a selling point for their DX11 hardware and try to capitalize on it in the short time they have being the only DX11 game in town.

They already do. IIRC they are bundling Dirt 2 with the 5870.

Originally posted by: JSt0rm01
Originally posted by: DefRef
Yawn... Gee, is it "ATI fanboys squealing like a Miley Cyrus audience over how their beloved Red company will trounce the Green team" time again?

Listen, kids, ATI put out a new card first and nVidia will have one coming later. There are two ways this can be spun:

1. The ATI fanboy way - "HA-HA! nVidia is crapping themselves because the new ATI 59000000000000000000000HDROFLCOPTERZOMGBBQ is teh fastester and since they aren't rushing to announce their GTX300 cards, it can only mean one thing - they got NUTHIN!!! Bwahahahahahaha!!!"

B. The sane person way - "ATI has unveiled their new card and it's quite impressive. However, now nVidia knows exactly where the price/power bar is set and could respond with something that smokes ATI's offering and offer gamers a better value. Since ATI has nothing to retaliate with for at least a year, those early sales will be pretty much it."

IF nVidia matches and/or exceeds what ATI has - pretty likely since, other than the 9700 vs. 5800 days, nVidia's had ATI's number for a decade - then I wonder how many people whooping it up now will become whiners that they got "stuck" with a slow card? Heh.

what about points 2. and A?

:laugh:
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: thilan29
Originally posted by: OCguy
Um...4890 = OC'd 4870. You can pretty much predict what a 5890 would be by overclocking the core to 1GHz and overclocking the RAM as well.

And it ended up competing with the GTX275 and sometimes even with the GTX285 so it stands to reason that IF the 5870 only competes with the GTX360 then they could release a 5890 to compete with the GTX380 (or whatever they're new cards are called) pretty easily could they not?

Expanding on this a bit: depending on how long it takes Nvidia to get the GT300 out, the 5870x2 could already be on the market by the time it appears. Isn't that supposed to be out by November? If the 5870x2 beats the GT300 to market, then I see Nvidia in real trouble, unless the GT300 is so good that it beats 2x 5870 performance... which I doubt it would, personally.

Nvidia can always release an x2 part as well, but it seems like their x2 parts are more reactionary than planned... I could just see an x2 Nvidia part taking a while to show up, assuming it's needed.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
Originally posted by: DefRef
Yawn... Gee, is it "ATI fanboys squealing like a Miley Cyrus audience over how their beloved Red company will trounce the Green team" time again?

Listen, kids, ATI put out a new card first and nVidia will have one coming later. There are two ways this can be spun:

1. The ATI fanboy way - "HA-HA! nVidia is crapping themselves because the new ATI 59000000000000000000000HDROFLCOPTERZOMGBBQ is teh fastester and since they aren't rushing to announce their GTX300 cards, it can only mean one thing - they got NUTHIN!!! Bwahahahahahaha!!!"

2. The sane person way - "ATI has unveiled their new card and it's quite impressive. However, now nVidia knows exactly where the price/power bar is set and could respond with something that smokes ATI's offering and offer gamers a better value. Since ATI has nothing to retaliate with for at least a year, those early sales will be pretty much it."

IF nVidia matches and/or exceeds what ATI has - pretty likely since, other than the 9700 vs. 5800 days, nVidia's had ATI's number for a decade - then I wonder how many people whooping it up now will become whiners that they got "stuck" with a slow card? Heh.

You forgot (and I will quote bryan here because his reply was perfect)

3. The Nvidia Way - "We don't know when gt300 will be released. 5xxx is kicking our ass, so we're going to throw up more physix smoke and mirrors. We might not even beat 5870 as everyone expects. Think nv30 vs 9700 pro. By the way, are you guys hiring?"

Also, it's too late in the game for Nvidia to change the specifications of the GT300 other than clocks, unless they have given up on a 2009 launch, so if it sucks, it sucks, nothing you can do about it (not saying it does, I'm pretty sure it will be faster)
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: OneOfTheseDays
AMD does not care about the ultimate performance crown. That's not where the money is necessarily. It's all about price/performance, and it's quite clear that they have done very well in that sector of the market.

The 5870 is no exception. I have no doubt Nvidia will likely release a card that bests the 5870, but at what cost? What will its power consumption numbers be? How monstrous a card will it have to be?

IMO this is nothing but a 2nd place finisher justifying being first loser. AMD is IMO required to have the top performing CPU and GPU due to the position they are in. They can play the nice-guy price/performance tune to the internet fanbois. But at the end of the day all it has done for them is mountains of debt with little hope of recovery.

Steam survey results don't lie. Since the C2D intro, AMD's CPU market share in the gaming world has gone from ~55-60% to ~30%. In the GPU market, since the debut of the 6800 series they went from 50% to 35%.

People abandoned them as they became 2nd fiddle to the other market player.

 

Genx87

Lifer
Apr 8, 2002
Originally posted by: dguy6789
The anti ATI bias is so thick in here!

There's no guarantee the GT300 will even be faster than the 5870. That is speculation and assumption all by itself. Now if it is faster, what are the odds that it will be a good value? Rumors say the GT300 is a new architecture, the largest, most complex, most expensive GPU Nvidia will have ever produced. The chances of it being able to compete price wise with the 5870 are nil. If Nvidia launches the GT300 and it is 30-50% faster than the 5870 for $600, they are done like dinner because they have nothing to beat the 5870x2 that they will be competing against in that price segment.

You are correct. However, the performance goal for Nvidia the last couple of generations has been the previous generation's SLI or X2 chip. So we can look at the 295 and guesstimate where they are planning the performance to be. Whether they hit that, we won't know until the release. If they do, then they will be faster than AMD with a single GPU design.
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: mmnno
Thinking more logically, nVidia probably will not let ATi beat them in value. They will either release a more powerful card at a higher price, which enthusiasts will be happy to pay if it is reasonable, or they will release a competitive card and force ATi to cut prices.

The only way nVidia can disappoint is if they release a competitive card at a higher price, thinking that features like PhysX and CUDA give them pricing power. That would suck but I suspect ATi might cut prices anyway, so I'm confident that waiting will bring better deals.

Also, who is to say that NV has any interest in lowering prices? They might just like the price/performance level set by AMD and choose to simply align the ASPs of the SKUs to that price/performance segmentation.

We would get more choices but not necessarily better price/performance.

(this is assuming NV has the gross-margin luxury of deciding where to place their ASPs)

Originally posted by: dguy6789
The anti ATI bias is so thick in here!

There's no guarantee the GT300 will even be faster than the 5870. That is speculation and assumption all by itself. Now if it is faster, what are the odds that it will be a good value? Rumors say the GT300 is a new architecture, the largest, most complex, most expensive GPU Nvidia will have ever produced. The chances of it being able to compete price wise with the 5870 are nil. If Nvidia launches the GT300 and it is 30-50% faster than the 5870 for $600, they are done like dinner because they have nothing to beat the 5870x2 that they will be competing against in that price segment.

The part in bold, I really don't see how you can substantiate that. We have no idea how much resources NV invested into the design for manufacturing (DFM) aspects of design of GT300 versus whatever DFM investments AMD put into Cypress.

In the absence of these kinds of development budget numbers we can't just arbitrarily conclude larger more complex designs means more cost to produce the resultant product.

As a process technologist, considering that NV was able to produce this and sell it for <$300 is a testament to their DFM capabilities. You'll be hard-pressed, and I say this with confidence, to find ANY integrated circuit even remotely close to that die size being sold for less than four figures, and for very good reasons.

Originally posted by: SlowSpyder
Nvidia can always release an x2 part as well, but it seems like their x2 parts are more reactionary than planned... I could just see an x2 Nvidia part taking a while to show up, assuming it's needed.

Steve, not defending NV here nor attempting to argue with you as to whether your consumer-level perception is correct, but I would just say that given the length (time) of the development pipeline for products of this complexity it is very difficult to do anything as a reaction to your competitors existing products.

Gen N products compete with Gen N products, by the time you've had the time to tweak a Gen N design to better compete with the competitors Gen N product the Gen N+1 stuff will already be out on the market.

These pipelines are about 2-3 yrs deep/long, really at best a design team can factor in information about existing Gen N products to alter their plans on Gen N+2 designs. They can try and do crisis mode stuff to intersect N+1 designs in the pipe, but man does that wreak havoc on the budget as well as elevate the risk of missing the timeline for Gen N+1 deployment to the extremes.

Just saying that while you as a consumer may be justified in perceiving these guys as dickering with each other in the Gen N landscape (and they do to a certain extent, GPU clocks and prices are the proof) its really just marketing at that point, the products themselves were set in stone a minimum of 12 months prior to you or I seeing them in Newegg.
 

dguy6789

Diamond Member
Dec 9, 2002
Considering the direction Nvidia has been going with their GPUs and the direction ATI has been going, that's the conclusion I came to: the RV870 is going to be cheaper than the GT300. It's only speculation though. I doubt very many people expect the GT300 to be $379, but if it ends up being that much, then that's great.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Idontcare
As a process technologist, considering that NV was able to produce this and sell it for <$300 is a testament to their DFM capabilities. You'll be hard-pressed, and I say this with confidence, to find ANY integrated circuit even remotely close to that die size being sold for less than four figures, and for very good reasons.

They were sort of forced into selling it that low though with the competition they had. And I've read that they were losing money on those chips once they cut prices...although that was never proved beyond a doubt but I could see them taking the hit to keep their marketshare.
 

Stoneburner

Diamond Member
May 29, 2003
Why do I keep hearing this myth that Nvidia smoked ATI in every generation after R300? The X series was about equal, the X1900s were about equal. It was the 2900 series and beyond where ATI fell behind at the top end.
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: Idontcare

Originally posted by: SlowSpyder
Nvidia can always release an x2 part as well, but it seems like their x2 parts are more reactionary than planned... I could just see an x2 Nvidia part taking a while to show up, assuming it's needed.

Steve, not defending NV here nor attempting to argue with you as to whether your consumer-level perception is correct, but I would just say that given the length (time) of the development pipeline for products of this complexity it is very difficult to do anything as a reaction to your competitors existing products.

Gen N products compete with Gen N products, by the time you've had the time to tweak a Gen N design to better compete with the competitors Gen N product the Gen N+1 stuff will already be out on the market.

These pipelines are about 2-3 yrs deep/long, really at best a design team can factor in information about existing Gen N products to alter their plans on Gen N+2 designs. They can try and do crisis mode stuff to intersect N+1 designs in the pipe, but man does that wreak havoc on the budget as well as elevate the risk of missing the timeline for Gen N+1 deployment to the extremes.

Just saying that while you as a consumer may be justified in perceiving these guys as dickering with each other in the Gen N landscape (and they do to a certain extent, GPU clocks and prices are the proof) its really just marketing at that point, the products themselves were set in stone a minimum of 12 months prior to you or I seeing them in Newegg.

For making an x2 card, wouldn't it be possible that building that part could take considerably less time though? No doubt building the GPU can take years, but I would think (maybe incorrectly) that once they have the GPU, building an x2 part could be done somewhat quickly. In Nvidia's case, I would think the sandwich card could be put together quickly enough; the real work would be in the boards and cooler since the GPU is already there.

If I'm wrong on this, just tell me to sit down and shuddup. :p
 

v8envy

Platinum Member
Sep 7, 2002
The clock is ticking for NV. Need I remind you that Black Friday is only 8 weeks away? This gives NV just 8 weeks to not only cough up a product, but to build positive buzz *AND* make sure there is enough supply to meet the demand of the hungry locust-like hordes doing Xmas shopping. A paper launch will be of zero value if all Little Johnny wants for Christmas is a high end DX11 video card and the only one available is the 5870. At the midrange and low end it's all about the 48XX vs 2XX -- which is why NV is trying to marginalize DX11. A new ATI midrange product featuring DX11 in time for Black Friday would be devastating for previous generation product sales.

If NV's next gen product hits next year the only prospective buyers are value vultures like yours truly. My ilk is not likely to overpay for the privilege of being the first kid on the block with a new toy -- we'll hang back and wait for attractive pricing as perceived by us. People struggling under the weight of recently maxed out credit cards do not make for the best market.
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: thilan29
Originally posted by: Idontcare
As a process technologist, considering that NV was able to produce this and sell it for <$300 is a testament to their DFM capabilities. You'll be hard-pressed, and I say this with confidence, to find ANY integrated circuit even remotely close to that die size being sold for less than four figures, and for very good reasons.

They were sort of forced into selling it that low though with the competition they had. And I've read that they were losing money on those chips once they cut prices...although that was never proved beyond a doubt but I could see them taking the hit to keep their marketshare.

thilan, that doesn't really negate my point about using die size comparisons alone as the basis for cost estimations. If that isn't self-evident, then I'll concede I failed to properly make my point to begin with, and I'll happily go back to the drawing board and try again if you think you would find value in my doing so. I don't mind making the effort.

Originally posted by: SlowSpyder
For making an x2 card, wouldn't it be possible that building that part could take considerably less time though? No doubt building the GPU can take years, but I would think (maybe incorrectly) that once they have the GPU, building an x2 part could be done somewhat quickly. In Nvidia's case, I would think the sandwich card could be put together quickly enough; the real work would be in the boards and cooler since the GPU is already there.

If I'm wrong on this, just tell me to sit down and shuddup. :p

Possible for it to take considerably less time? Of course. I'd be a fool to try and argue against that.

I'm not trying to make this an open-and-shut, fact-laden argument. It is very much like that Charlie article about NV yields being 2%: it wasn't impossible for that to be true, but in the scheme of things, and given how businesses generally operate, the relevance of the data (if the data are even true) just doesn't make sense with how reality operates.

Can an X2 product be whipped up, expedited at lightning speed? You bet, 100% absolutely it can be done. At what cost? And to what gain?

Decision makers aren't as rash and quick to change programs as they are portrayed to be by the time their actions get filtered thru sensationalistic headlines in the media to reach our ears.

I am by no means trying to imply that an X2-type product requires anywhere near the development pipeline of a single-GPU derived product, but it isn't practical from any business sense to put the resources into rushing a project from non-existence to market in, say, less than 6 months' time. 9-12 months is an economically viable timeline; still crisis mode, and the R&D efficiency of your budget is totally blown, but you aren't going to risk ruination of the company putting a team on that kind of pace.

For example, I can tell you with 99% confidence that right now whatever GT400-derived X2 type product is going to come to the market is already locked in stone on Nvidia's roadmap for both budgetary reasons and project milestone cadence. They could change the specs or plans, but doing so would come with markedly elevated risk to their market intersection timeline.

GT500-based product is still early enough in development that changes are manageable without significantly elevating risk or budget to counter the elevated risk.

I have seen, been a part of, crisis mode style management before and it is so not pretty and actually quite ineffective when it comes to maximizing shareholder value. Doesn't mean it doesn't happen, just means when we reach for our Occam's razor to try and pare down the probable scenarios to explain certain market events, based on experience I personally would be comfortable with ruling out the "sli on a stick" sku as being purely reactionary to AMD's X2 product strategy.

I don't have neatly compiled and presentable data to offer you in this situation to compel you to believe my opinion on this, but we do go back a ways, so I did want to take the time to try and round out your perspective on more of the backstory that can go on with these things. If we were in a bar having a few brewskies together right now, there is no doubt we'd have scribbled all over a few napkins with some distribution graphs and timeline milestones by now :p :laugh:
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: SlowSpyder
Nvidia can always release an x2 part as well, but it seems like their x2 parts are more reactionary than planned... I could just see an x2 Nvidia part taking a while to show up, assuming it's needed.


nV releases X2 cards every generation. There was a report months ago they are working on a GT300 gx2 card.

Trying to discredit what could be the fastest single card for the next year or so by saying it "isn't planned" doesn't really work.


I am fairly certain the single PCB 295 was nV's test-run with a non-sandwich gx2 card. You really think they made that switch in the middle of the product's lifecycle for the hell of it?
 

FalseChristian

Diamond Member
Jan 7, 2002
I think nVidia will come out with something slightly faster than the 5870, and it'll be at a competitive price point. I even think they'll have 1024MB of GDDR5 on a 512-bit bus.
 

AzN

Banned
Nov 26, 2001
Yeah ATI really dropped the ball on Nvidia last round and continuing the same saga this round.

Who would have thought making upper mid-range chips and making high performance dual GPU cards would hurt Nvidia with their huge single chip design.

It made Nvidia imitate ATI's business strategy last round. This round is no different.

Now who here thinks GT300 can't be made into a gx2 card because of Nvidia's power hungry design and lose to 5870x2 cards once it's released?
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: Azn
Yeah ATI really dropped the ball on Nvidia last round and continuing the same saga this round.

Who would have thought making upper mid-range chips and making high performance dual GPU cards would hurt Nvidia with their huge single chip design.

It made Nvidia imitate ATI's business strategy last round. This round is no different.

Now who here thinks GT300 can't be made into a gx2 card because of Nvidia's power hungry design and lose to 5870x2 cards once it's released?

I think it is too early to tell exactly what is happening this round. ATi is getting some early adopters, but there are a ton of people waiting to see what happens with G300.

If a single G300 is faster than the 5870, then I would bet the farm that the gx2 part is faster than the x2 part as well.

Gaining 3% marketshare and still having a net loss, when you have all the hype and great price/performance like the 4890, isn't exactly "dropping the ball" on nV.

Of course this could all be moot if G300 comes out and does not compete with 5XXX. At that point, ATi would have both value and top performance. You would see a massive marketshare shift in AMD's favor.

 
Apr 20, 2008
Originally posted by: Warren21
Originally posted by: Scholzpdx
I'll get the cheaper of the two cards once both are out on the market. Which probably means ATI again. And that's unfortunate, as the video card that brought the most innovation and power (to me personally) was my 8800GTS 320MB. That was one powerful card for its time. :sigh:

This doesn't make any logical sense. You'd be sad if you 'had' to get a 5xxx card instead of GT3xx-based card because you enjoyed your 8800 GTS 320 so much?

What does the 8800's supposed innovation and past price/performance have to do with the relative innovation and price/performance of either RV870 or GT300? The Radeon 9500 Pro's were very powerful and fully featured for their time and price as well. It's also totally irrelevant.

Each team brings innovation and cost savings to the table, they each deserve kudos no matter if your pixels are green or red.

All I sense from the above is bias... Which is fine, just don't hide it.

***
More on topic:

I agree I'd like to see more info from nV than just stone throwing and FUD-inspired questions. This isn't a response typical of their undeniable performance leadership we've seen in the past few years.

Someone saying that I have a bias towards nVidia.

I think I've seen it all.