bryanW1995
Lifer
- May 22, 2007
Originally posted by: SonicIce
I just wish Rollo was still here for all this
he probably is, he just changed his name.
Originally posted by: SonicIce
I just wish Rollo was still here for all this
Originally posted by: DefRef
Yawn... Gee, is it "ATI fanboys squealing like a Miley Cyrus audience over how their beloved Red company will trounce the Green team" time again?
Listen, kids, ATI put out a new card first and nVidia will have one coming later. There are two ways this can be spun:
1. The ATI fanboy way - "HA-HA! nVidia is crapping themselves because the new ATI 59000000000000000000000HDROFLCOPTERZOMGBBQ is teh fastester and since they aren't rushing to announce their GTX300 cards, it can only mean one thing - they got NUTHIN!!! Bwahahahahahaha!!!"
B. The sane person way - "ATI has unveiled their new card and it's quite impressive. However, now nVidia knows exactly where the price/power bar is set and could respond with something that smokes ATI's offering and offer gamers a better value. Since ATI has nothing to retaliate with for at least a year, those early sales will be pretty much it."
IF nVidia matches and/or exceeds what ATI has - pretty likely since, other than the 9700 vs. 5800 days, nVidia's had ATI's number for a decade - then I wonder how many people whooping it up now will become whiners that they got "stuck" with a slow card? Heh.
Originally posted by: OCguy
Originally posted by: SlowSpyder
Nvidia can always release an x2 part as well, but it seems like their x2 parts are more reactionary than planned... I could just see an x2 Nvidia part taking a while to show up, assuming it's needed.
nV releases X2 cards every generation. There was a report months ago they are working on a GT300 gx2 card.
Trying to discredit what could be the fastest single card for the next year or so by saying it "isn't planned" doesn't really work.
I am fairly certain the single PCB 295 was nV's test-run with a non-sandwich gx2 card. You really think they made that switch in the middle of the product's lifecycle for the hell of it?
Originally posted by: bryanW1995
didn't they make that switch after they went to 55nm? They probably made it as a 2 pcb solution originally b/c they couldn't make it work with high-enough clock speeds on a single pcb at 65nm, then switched to the less expensive design when the switch to 55nm allowed a die shrink.
Originally posted by: OCguy
Originally posted by: bryanW1995
didn't they make that switch after they went to 55nm? They probably made it as a 2 pcb solution originally b/c they couldn't make it work with high-enough clock speeds on a single pcb at 65nm, then switched to the less expensive design when the switch to 55nm allowed a die shrink.
I was under the impression that the original 295 was already 55nm, because it was 2x275s, but I could be wrong.
Originally posted by: OCguy
Originally posted by: Azn
Yeah ATI really dropped the ball on Nvidia last round and continuing the same saga this round.
Who would have thought making upper mid-range chips and making high performance dual GPU cards would hurt Nvidia with their huge single chip design.
It made Nvidia imitate ATI's business strategy last round. This round is no different.
Now who here thinks GT300 can't be made into a gx2 card because of Nvidia's power hungry design and lose to 5870x2 cards once it's released?
I think it is too early to tell exactly what is happening this round. ATi is getting some early adopters, but there are a ton of people waiting to see what happens with G300.
If a single G300 is faster than the 5870, then I would bet the farm that the gx2 part is faster than the x2 part as well.
Gaining 3% marketshare and still having a net loss in money, when you have all the hype and great price/performance like the 4890, isn't exactly "dropping the ball" on nV.
Of course this could all be moot if G300 comes out and does not compete with 5XXX. At that point, ATi would have both value and top performance. You would see a massive marketshare shift in AMD's favor.
Originally posted by: Idontcare
thilan, that doesn't really negate my point about using die-size comparisons as the sole basis for cost estimates. If that isn't self-evident then I'll concede I failed to properly make my point to begin with, and I'll happily go back to the drawing board and try again if you think you would find value in my doing so. I don't mind making the effort.
Originally posted by: Azn
It is early but 5870 is really putting the pressure on Nvidia. Nvidia has a tough road ahead of them.
As for a gx2: that was the main question. I doubt Nvidia will make a gx2 unless they adopt the same low-power-consumption strategy as the 5870, which is highly unlikely, and still perform faster than it. Will a gx2 even be possible on the GT300? Perhaps a cut-down version of the chip, like how they did the GTX295. That means Nvidia will have to perform 30% faster than the 5870 to be competitive, or something of that nature.
Originally posted by: OCguy
Maybe they concentrated more on GPGPU so 3D gaming suffers? Maybe it is 25% faster than 5XXX? Nobody knows.
Originally posted by: Idontcare
Originally posted by: mmnno
Thinking more logically, nVidia probably will not let ATi beat them in value. They will either release a more powerful card at a higher price, which enthusiasts will be happy to pay if it is reasonable, or they will release a competitive card and force ATi to cut prices.
The only way nVidia can disappoint is if they release a competitive card at a higher price, thinking that features like PhysX and CUDA give them pricing power. That would suck but I suspect ATi might cut prices anyway, so I'm confident that waiting will bring better deals.
Also, who is to say that NV has any interest in lowering prices? They might just like the price/performance level set by AMD and choose to simply align the ASPs of their SKUs to that price/performance segmentation.
Originally posted by: thilan29
Originally posted by: Idontcare
thilan, that doesn't really negate my point about using die-size comparisons as the sole basis for cost estimates. If that isn't self-evident then I'll concede I failed to properly make my point to begin with, and I'll happily go back to the drawing board and try again if you think you would find value in my doing so. I don't mind making the effort.
Did you mean just the die selling for less than $300 or the whole card? Is it possible that it was the AIBs that were losing money and not nV?
Originally posted by: Idontcare
I was just speaking about the IC itself. Regardless of whether they actually sold it for $200 or $300 or $400, my point is that it was designed to be manufacturable at a price point basically an order of magnitude lower than that of comparably sized chips.
Originally posted by: thilan29
Originally posted by: Idontcare
I was just speaking about the IC itself. Regardless of whether they actually sold it for $200 or $300 or $400, my point is that it was designed to be manufacturable at a price point basically an order of magnitude lower than that of comparably sized chips.
I can't find the link anymore, but there was a table showing the cost of each component on a card, and it put the GTX285 die at $90 (probably the price to the AIB partner, so even less for nV?). Knowing now what you've said, I'm glad they ARE able to sell the whole card for so cheap.
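The die-size/cost point being argued above can be sketched with toy numbers. Everything below (wafer cost, defect density, and the two die areas, loosely GTX285-class vs. RV770-class) is an assumption for illustration, not a figure from the thread; the takeaway is only that cost per good die grows faster than die area, because a bigger die means both fewer candidates per wafer and worse yield:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diam_mm=300):
    """Standard approximation for usable dies on a round wafer:
    gross area divided by die area, minus an edge-loss term."""
    r = wafer_diam_mm / 2
    return math.floor(
        math.pi * r * r / die_area_mm2
        - math.pi * wafer_diam_mm / math.sqrt(2 * die_area_mm2)
    )

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defect_density_per_mm2):
    """Simple Poisson yield model: yield falls exponentially with die
    area, so large dies get hit twice (fewer dies AND lower yield)."""
    yield_frac = math.exp(-defect_density_per_mm2 * die_area_mm2)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_frac)

# Assumed: $5000 per wafer, 0.002 defects per mm^2.
big   = cost_per_good_die(470, 5000, 0.002)  # ~GTX285-sized die
small = cost_per_good_die(256, 5000, 0.002)  # ~RV770-sized die
print(f"big die: ${big:.0f}, small die: ${small:.0f}")
```

With these made-up inputs the ~1.8x larger die comes out roughly 3x more expensive per good die, which is why die-size comparisons alone understate the cost gap, and also why they aren't the whole story (packaging, board, memory, and binning all sit on top).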