Nvidia stockpiles 55nm parts for a massive assault on ATI


ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: keysplayr2003
Bold 3: How does this matter to you? Me? Any end user?
What in the world is this infatuation with company costs? As long as YOU, the end user pays a reasonable price for a certain performance segment, that is where your infatuation should end. Because if it doesn't end, that means that you care more about the company than you do the hardware. Which does not make sense.
I rather like this post, so I think I'll repost it.

This forum seems to have a dual purpose. One is to weigh and evaluate our options as consumers of video cards, which is obviously a chief preoccupation among us. The other is to comment on the state of the industry as a whole. This second area is intrinsically interesting, but also relevant to the first: profit margins for ATI and NV shape our expectations for competition in future product cycles. It is particularly relevant with this latest generation, as AMD's graphics division seems to have returned to profitability and many of us want to see it remain competitive, lest we once again revisit the G80 situation where the minimum buy-in for acceptable performance was $300.

- woolfe
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com

i am looking forward to this "massive assault"

Tesla sure was an expensive disappointment for gamers

Let's see what Nvidia comes up with this time; i think they learn quickly from their miscalculations - faster than ATi ever did



 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: keysplayr2003
Bold 1: See where your head is at? I attempted to "get" him to come up with a figure.
What really happened: I asked him because he sounded like he knew, and I didn't.

Calm down, keys... All biostud said was:

It will still be more expensive to produce than the RV770 core, but more competition is always good.

I'm not sure how you equate that to "he sounded like he knew, and I didn't". Perhaps he did. I simply expounded on his statement and explained in greater detail. From your statement it seems you were interested in the answer, so I attempted to explain it to you. Isn't that what you were after? An answer?



Originally posted by: keysplayr2003
Bold 2: What are you talking about? GTX 260 Core 216 is even cheaper than a 4870 1GB card right now, and some offers include a bundled Far Cry 2! If what you say is so true, then AMD should be able to offer their 4870 at a cheaper price with a bundled game too? Right?

Yes, they should. For a while, the 4870 was cheaper than the GTX260. Now Nvidia has dropped the price even lower. Some people are speculating that Nvidia is attempting to dump stock of 65nm GTX260s to make way for their new 55nm 260s. It's possible this is simply a fire sale in order to just get rid of them. Maybe it isn't. I doubt anybody here is privy to that information, so all we can do is guess.



Originally posted by: keysplayr2003
Bold 3: How does this matter to you? Me? Any end user?
What in the world is this infatuation with company costs? As long as YOU, the end user pays a reasonable price for a certain performance segment, that is where your infatuation should end. Because if it doesn't end, that means that you care more about the company than you do the hardware. Which does not make sense.

If it doesn't matter to you, then why on earth did you ask the question in the first place?

Originally posted by: keysplayr2003
How much does each cost to manufacture?
 

rjc

Member
Sep 27, 2007
99
0
0
Originally posted by: keysplayr2003
Originally posted by: BladeVenom
This sounds like bad news for Nvidia. They have stockpiled a 3-month supply of 55nm parts. I guess they can't get rid of all the old 65nm parts, and they don't want to release the new parts since they can't even sell the old ones. So they are sitting on mounds of GPUs they can't sell, and we are in a recession. Very bad news for Nvidia.

Rethink this, and maybe you'll come up with the correct scenario.

I have been thinking about this, and I'm having real difficulty explaining it at all...

Look at the expreview article here:
http://en.expreview.com/2008/1...5nm-gt200-exposed.html
Supposedly a 55nm GT200 with a production date of week 33, 2008 (i.e. August 11-17).
They were producing 65nm parts only a few weeks before:
http://www.techpowerup.com/rev.../images/front_full.jpg
GT200 65nm, week 27, 2008 (i.e. June 30 - July 6)
And also afterwards (apparently from an XFX card):
http://img201.imageshack.us/im...431695570480558xy8.jpg
GT200 65nm, week 35, 2008 (i.e. August 25-31)

Why continue to respin and produce 65nm cards after 55nm production has started?

I can think of a few possibilities:
1) The expreview picture is a fake.
2) The expreview picture is of a sample chip; the production run hadn't started yet.
3) They cannot get sufficient 55nm capacity to meet demand, so they're running 65nm in parallel.
4) ????

Discounting 1), since the expreview people seem fairly confident and there have also been widespread rumors of a 55nm GT200 coming soon. Discounting 3), since if demand were so great, surely a 55nm part would have shown up in cards by now.

That leaves 2), I guess. Can anyone think of another explanation?
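
Incidentally, the week-number-to-date conversions above are easy to double-check. A minimal Python sketch, assuming the chip date codes follow ISO week numbering (which matches the ranges given; `date.fromisocalendar` needs Python 3.8+):

```python
from datetime import date, timedelta

def iso_week_range(year, week):
    """Return the Monday and Sunday of the given ISO year/week."""
    monday = date.fromisocalendar(year, week, 1)  # weekday 1 = Monday
    return monday, monday + timedelta(days=6)

# Date codes from the chip photos linked above.
for year, week, label in [(2008, 27, "65nm GT200"),
                          (2008, 33, "55nm GT200, expreview"),
                          (2008, 35, "65nm GT200, XFX")]:
    start, end = iso_week_range(year, week)
    print(f"week {week} {year}: {start} to {end}  ({label})")

# Output:
# week 27 2008: 2008-06-30 to 2008-07-06  (65nm GT200)
# week 33 2008: 2008-08-11 to 2008-08-17  (55nm GT200, expreview)
# week 35 2008: 2008-08-25 to 2008-08-31  (65nm GT200, XFX)
```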
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
So what would a 55nm chip do for me? I couldn't care less about power consumption, as I use my PC for gaming and not to impress some environmentalist that I'm "green" (no pun intended).
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: cmdrdredd
So what would a 55nm chip do for me? I couldn't care less about power consumption, as I use my PC for gaming and not to impress some environmentalist that I'm "green" (no pun intended).

Lower prices, for one. And even though lower power consumption does help with being "green", it also means that you can possibly get by with a lower-RPM fan, so your card will be quieter. In addition, lower power consumption can give card manufacturers more headroom to increase clock speeds, so you could end up with a faster card. Minor tweaks can also accompany a die shrink, so you may get better performance there as well.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
Originally posted by: cmdrdredd
So what would a 55nm chip do for me? I couldn't care less about power consumption, as I use my PC for gaming and not to impress some environmentalist that I'm "green" (no pun intended).

Lower prices, for one. And even though lower power consumption does help with being "green", it also means that you can possibly get by with a lower-RPM fan, so your card will be quieter. In addition, lower power consumption can give card manufacturers more headroom to increase clock speeds, so you could end up with a faster card. Minor tweaks can also accompany a die shrink, so you may get better performance there as well.

Then again, the current 65nm GTX 260 Core 216 uses less power than the 55nm 4870s, so there are apparently other factors in play as well...

http://techreport.com/articles.x/15651/11
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Creig


Both AMD and Nvidia have R&D costs, they both have overhead, they both pay for marketing.
Yes, but one company is making a profit while the other is not, and one company is worth more than the other.


In fact, I would imagine a 1.4-billion-transistor GPU would cost more in R&D than a 965-million-transistor one.
I can imagine a blue dragon eating planets made of marshmallows. Does that make it a fact?

Plus, I'm pretty sure Nvidia spends more on advertising than AMD.
I'm pretty sure the moon is made of cheese.
This would make Nvidia GPUs even MORE expensive compared to their AMD counterparts. So in your attempt to prove me wrong, you're actually driving my point home. Thanks!
Your point being that you have not one shred of evidence?
I'm not sure what your wisecrack "reality" comment is supposed to mean, however.

I bet you don't. In "reality" you have to have some facts to back up your statements. But in reality you have no idea how much either company spends on napkins for its cafeteria, let alone its GPU production. You are making wild speculation based on your years of experience reading the internet, which amounts to... well, nothing really.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Creig
Originally posted by: keysplayr2003
Bold 1: See where your head is at? I attempted to "get" him to come up with a figure.
What really happened: I asked him because he sounded like he knew, and I didn't.

Calm down, keys... All biostud said was:

It will still be more expensive to produce than the RV770 core, but more competition is always good.

I'm not sure how you equate that to "he sounded like he knew, and I didn't". Perhaps he did. I simply expounded on his statement and explained in greater detail. From your statement it seems you were interested in the answer, so I attempted to explain it to you. Isn't that what you were after? An answer?



Originally posted by: keysplayr2003
Bold 2: What are you talking about? GTX 260 core 216 is even cheaper than a 48701GB card right now, and some offers include bundled FarCry2! If what you say is so true, then AMD should be able to offer their 4870 at a cheaper price with a bundled game too? Right?

Yes, they should. For a while, the 4870 was cheaper than the GTX260. Now Nvidia has dropped the price even lower. Some people are speculating that Nvidia is attempting to dump stock of 65nm GTX260s to make way for their new 55nm 260s. It's possible this is simply a fire sale in order to just get rid of them. Maybe it isn't. I doubt anybody here is privy to that information, so all we can do is guess.



Originally posted by: keysplayr2003
Bold 3: How does this matter to you? Me? Any end user?
What in the world is this infatuation with company costs? As long as YOU, the end user pays a reasonable price for a certain performance segment, that is where your infatuation should end. Because if it doesn't end, that means that you care more about the company than you do the hardware. Which does not make sense.

If it doesn't matter to you, then why on earth did you ask the question in the first place?

I just wanted numbers, Creig. "It will still be more expensive" sounded to me like he knew.
And next time I ask someone besides you a simple question, maybe you should refrain from being helpful and answering for them. At least when I ask the question. You have a permanent animosity towards me and it's starting to become tiresome. Your posts are out of left field the majority of the time, leading me to believe you are on an entirely different basis for thought, instead of the subject matter. More power to you.

Biostud, I guess you don't know what they cost to manufacture. OK, that is fine.
Now we can resume our regularly scheduled program. All this drama for a simple "I don't know" answer.

 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: Wreckage
Originally posted by: Creig


Both AMD and Nvidia have R&D costs, they both have overhead, they both pay for marketing.
Yes, but one company is making a profit while the other is not, and one company is worth more than the other.


In fact, I would imagine a 1.4-billion-transistor GPU would cost more in R&D than a 965-million-transistor one.
I can imagine a blue dragon eating planets made of marshmallows. Does that make it a fact?

Plus, I'm pretty sure Nvidia spends more on advertising than AMD.
I'm pretty sure the moon is made of cheese.
This would make Nvidia GPUs even MORE expensive compared to their AMD counterparts. So in your attempt to prove me wrong, you're actually driving my point home. Thanks!
Your point being that you have not one shred of evidence?
I'm not sure what your wisecrack "reality" comment is supposed to mean, however.

I bet you don't. In "reality" you have to have some facts to back up your statements. But in reality you have no idea how much either company spends on napkins for its cafeteria, let alone its GPU production. You are making wild speculation based on your years of experience reading the internet, which amounts to... well, nothing really.

I never claimed to have concrete numbers, did I? Since neither company has released the information, I simply took the snippets of information that have leaked out regarding both GPUs and made a reasonable sketchwork of AMD vs Nvidia GPU cost ratios.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: nRollo
Originally posted by: Creig
Originally posted by: cmdrdredd
So what would a 55nm chip do for me? I couldn't care less about power consumption, as I use my PC for gaming and not to impress some environmentalist that I'm "green" (no pun intended).

Lower prices, for one. And even though lower power consumption does help with being "green", it also means that you can possibly get by with a lower-RPM fan, so your card will be quieter. In addition, lower power consumption can give card manufacturers more headroom to increase clock speeds, so you could end up with a faster card. Minor tweaks can also accompany a die shrink, so you may get better performance there as well.

Then again, the current 65nm GTX 260 Core 216 uses less power than the 55nm 4870s, so there are apparently other factors in play as well...

http://techreport.com/articles.x/15651/11

Well, the 4870 uses some special digital PWMs too. I don't know any technical details. Perhaps there's some stuff going on inside the 4870 that just isn't being used when it's on. Sorta like running on 8 cylinders and only using 4 of them. You're still sucking gas for 8.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: keysplayr2003
I just wanted numbers, Creig. "It will still be more expensive" sounded to me like he knew.
And next time I ask someone besides you a simple question, maybe you should refrain from being helpful and answering for them. At least when I ask the question.

If you don't want anybody else but a single person to answer your question, perhaps it would be better if you simply PM'ed them. This is an open, public forum after all. If a question is posted, anybody is free to answer it.


Originally posted by: keysplayr2003
You have a permanent animosity towards me and it's starting to become tiresome.

That's rather hypocritical coming from a guy who just told me:

Originally posted by: keysplayr2003
See where your head is at?


Originally posted by: keysplayr2003
Your posts are out of left field the majority of the time, leading me to believe you are on an entirely different basis for thought, instead of the subject matter. More power to you.

You asked a question regarding costs and I tried to answer it for you. Then you claimed that you didn't care to know the answer in the first place. I don't see how that makes me "out of left field".
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Creig

I never claimed to have concrete numbers, did I? Since neither company has released the information, I simply took the snippets of information that have leaked out regarding both GPUs and made a reasonable sketchwork of AMD vs Nvidia GPU cost ratios.

In other words, you simply made up the numbers.
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
Originally posted by: Creig
Originally posted by: cmdrdredd
So what would a 55nm chip do for me? I couldn't care less about power consumption, as I use my PC for gaming and not to impress some environmentalist that I'm "green" (no pun intended).

Lower prices, for one.

Lower prices maybe for the companies producing the boards, but I SERIOUSLY doubt the 55nm variants will be priced lower for us consumers than the 65nm ones currently are.

 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
Originally posted by: Wreckage
Originally posted by: Creig

I never claimed to have concrete numbers, did I? Since neither company has released the information, I simply took the snippets of information that have leaked out regarding both GPUs and made a reasonable sketchwork of AMD vs Nvidia GPU cost ratios.

In other words, you simply made up the numbers.

1.4 billion and 900+ million transistors are not made up, are they? In "reality", Creig's sketchwork and rjc's calculations make sense... in reality, that is... and they are the best estimates you can make based on the limited information available.

No one here seems to know the ACTUAL costs (to nV and ATI), but I'm willing to bet most would say it costs more to make the 1.4-billion-transistor GPU, and they'd probably be right... in reality, of course.
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
Dang, everyone here needs to calm down a little. Someone made a statement that the nVidia chips cost more to make and everyone got all in a tizzy. I thought Creig's statements were mostly to the point and his assumptions, while backed up with speculation and internet rumors, still looked to have been logically obtained. rjc did an even better job showing how, in theory, it can be calculated that nVidia's chips cost more to make than ATI's. Is this an absolute fact that we can prove? Not really, but based on the little info we have it can be assumed.

Ok, that being semi-established ;) why are some here getting their panties in a wad? Why would you guys care if the GTX 260 costs eleventy billion to make vs. 38 cents for the 4870 if, in the end, the 260 costs less? As long as your favorite company is making money, and both are (if you look just at the GPU division of ATI), and they are offering what you want at a price you are willing to pay, then we are all good.

I even saw someone comparing GTX 260 prices to the 4870 and, it looked to me, like they were somehow using this as proof in the cost-of-the-chip assumption. What does the end price of a product have to do with its cost to manufacture? In some cases a lot and in others not so much. Look at the Wii vs the Xbox 360. You can now get an Xbox 360 for cheaper than a Wii (with coupons), yet which do you think costs more to make? The Xbox 360, by a fair bit, due to the complexity of the system as a whole and the (relatively) newer parts. The only reason the Wii costs $250 right now, the same price it came out at, is because the demand is still there. Why would Nintendo drop the price of the system when it is still selling like hot cakes? Same thing for ATI: sure, the GTX 260 216 gives the 4870 1GB a run for its money and is arguably the better card (from what I have seen it is). However, if the 4870 is still selling at ATI's targeted rate, why would they drop the price or throw in a free game? As soon as the 260 starts to impact sales enough and they cross whatever invisible line ATI has drawn, you can expect the prices to come down. In the meantime they are getting as much profit as they can.

Anyway, back to the original topic at hand: I have no idea what nVidia is thinking. The only barely-logical reason to stockpile inventory is to clear out the old stock first so it does not impact sales of the newer, better version. However, after years of IE (industrial engineering) classes and experience in the workplace as an IE, one of the things you learn is that stockpiling inventory is never a good idea, especially in a recession. It's all about the just-in-time system and making sure you only make barely enough to satisfy demand, especially in a market as volatile as the GPU one. Otherwise you are sitting on a product that has cost you money to make and is costing you money to store, yet gains you nothing. All the while its actual value to you may be shrinking... definitely an odd choice.

**Edited for name calling bordering on flaming, thanks for the heads up Virge and sorry if I offended anyone**

 

Slugbait

Elite Member
Oct 9, 1999
3,633
3
81
Spike, I like the preamble, and thanks for bringing the thread back on topic.

Originally posted by: Spike
Anyway, back to the original topic at hand: I have no idea what nVidia is thinking. The only barely-logical reason to stockpile inventory is to clear out the old stock first so it does not impact sales of the newer, better version. After years of IE (industrial engineering) classes and experience in the workplace as an IE, one of the things you learn is that stockpiling inventory is never a good idea, especially in a recession. It's all about the just-in-time system and making sure you only make barely enough to satisfy demand, especially in a market as volatile as the GPU one. Otherwise you are sitting on a product that has cost you money to make and is costing you money to store, yet gains you nothing. All the while its actual value to you may be shrinking... definitely an odd choice.

Let's start with Windows as an example: if it's not done by September, kiss Christmas good-bye. OEM systems have to be on the shelves. XP involved an extraordinary amount of cooperation between MS and OEMs, so it was a little different for that release.

So nVidia has just a smattering of GPUs ready by September. Now we have two problems:
a) Supply and demand: who gets the supply?
b) Supply and demand: who purchases the supply?

Another thing to address: they aren't clearing out old stock first. That stock will be replenished for as long as their former top-of-the-line chips will suffice as mid-range performers, and later as "value" purchases. Sad example.

Board manufacturers have a ramp-up period for building and testing, designing shelf boxes, marketing materials, magazine advertisements, etc. OEMs want those boards. But because the chips are coming in so late, nobody is ready for the Christmas shelves. Nor should they be... if official info about the chip were to leak, and then God forbid issues caused it to slip, good-bye Christmas to everyone: we would all wait until Q1 2009 to make our purchases. That kills both nVidia and ATi sales, as well as OEM box sales.

On the flip side, if it does ship on time and the manufacturers are ready, how many boards get into the mainstream? Now we have a very disgruntled Christmas: hardly anybody hanging out on this forum is going to spend money, so again it kills both nVidia and ATi sales, as well as OEM box sales. We're waiting for supply to catch up to demand, there is no reason for them to drop prices on existing cards, the new cards are astronomically priced, etc.

So if this story is true, my guess is they recognized they can't meet anyone's demands, OEM or consumer, without cannibalizing everyone's Christmas wishes. The next best strategy is to have a happy Christmas for all, including AMD, then flood the market with what they think is the AMD killer and totally rule over the next holiday season.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Creig
Originally posted by: keysplayr2003
I just wanted numbers, Creig. "It will still be more expensive" sounded to me like he knew.
And next time I ask someone besides you a simple question, maybe you should refrain from being helpful and answering for them. At least when I ask the question.

If you don't want anybody else but a single person to answer your question, perhaps it would be better if you simply PM'ed them. This is an open, public forum after all. If a question is posted, anybody is free to answer it.



Originally posted by: keysplayr2003
You have a permanent animosity towards me and it's starting to become tiresome.

That's rather hypocritical coming from a guy who just told me:

Originally posted by: keysplayr2003
See where your head is at?



Originally posted by: keysplayr2003
Your posts are out of left field the majority of the time, leading me to believe you are on an entirely different basis for thought, instead of the subject matter. More power to you.

You asked a question regarding costs and I tried to answer it for you. Then you claimed that you didn't care to know the answer in the first place. I don't see how that makes me "out of left field".

You see and don't see a whole lot at the same time. It's called selective comprehension.
Anyway, I'm gonna say "when" to this utterly pointless drama. Don't want to see the thread locked up.

Apologies to the thread participants for having to witness this bile.

 

WelshBloke

Lifer
Jan 12, 2005
30,443
8,109
136
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: keysplayr2003
I just wanted numbers, Creig. "It will still be more expensive" sounded to me like he knew.
And next time I ask someone besides you a simple question, maybe you should refrain from being helpful and answering for them. At least when I ask the question.

If you don't want anybody else but a single person to answer your question, perhaps it would be better if you simply PM'ed them. This is an open, public forum after all. If a question is posted, anybody is free to answer it.



Originally posted by: keysplayr2003
You have a permanent animosity towards me and it's starting to become tiresome.

That's rather hypocritical coming from a guy who just told me:

Originally posted by: keysplayr2003
See where your head is at?



Originally posted by: keysplayr2003
Your posts are out of left field the majority of the time, leading me to believe you are on an entirely different basis for thought, instead of the subject matter. More power to you.

You asked a question regarding costs and I tried to answer it for you. Then you claimed that you didn't care to know the answer in the first place. I don't see how that makes me "out of left field".

You see and don't see a whole lot at the same time. It's called selective comprehension.
Anyway, I'm gonna say "when" to this utterly pointless drama. Don't want to see the thread locked up.

Apologies to the thread participants for having to witness this bile.

We'll forgive you if you promise not to do it again. :p

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: WelshBloke



We'll forgive you if you promise not to do it again. :p

I can do that. But I can only speak for myself, unfortunately. ;)
 

rjc

Member
Sep 27, 2007
99
0
0
Expreview has put up a picture of Zotac's new GTX 260 board for the 55nm GT200s:
http://en.expreview.com/2008/1...nm-geforce-gtx260.html

The P654 10-layer board replaces the original P651 14-layer board. There is a new power module, and the memory has all been moved to one side of the board, using Samsung chips.

From the look of the card, that is where their original story came from.

Interestingly, there is a note halfway down that availability of the Volterra chip (the power module) was limiting mass production.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
http://en.expreview.com/2008/1...as-geforce-gtx295.html

GTX295 = GX2

The dual-GPU GeForce GTX260 GX2 graphics card will officially be named the GeForce GTX295. Apparently, the GeForce GTX295 is coming to regain the performance crown, which has been grabbed by the AMD Radeon HD4870X2. It will utilize two 55nm GT200s with 216 stream processors each, and will probably carry on the dual-PCB design. As of now, the frequencies and memory are not yet known.

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: rjc
It is generally quoted that a single 300mm 65nm wafer from TSMC costs around $5000. TSMC has lots of customers, so I guess over time the info has leaked out somehow. 55nm is more expensive, but hopefully not by too much.

So for a back-of-the-napkin calculation:
Total wafer area = pi * (300/2)^2 ≈ 70,686 mm^2
RV770 die size = 256 mm^2 => max 276 dies per wafer
GT200 die size = 576 mm^2 => max 122 dies per wafer

A more accurate count of dies per wafer could be obtained by finding a wafer shot of the relevant chip and counting the number of dies present.
And that's the problem: all of these die-cost estimates rest on severely flawed assumptions. You set a static price per wafer but ignore any price differences due to:

1) Volume
2) Process

Volume discounts are commonplace in any industry, and NV is one of TSMC's top partners, dwarfing ATI's orders.

Also, you can't seriously think a wafer costs the same regardless of process? New processes are going to cost more per wafer, as companies with high capitalization need to offset depreciation with higher revenues on new production lines. Given that the amortization of these assets is greatly accelerated due to the nature of the industry, older processes will be significantly cheaper than newer ones.

This all leads to utilization, which is the most important consideration for TSMC. If Nvidia keeps TSMC's older process lines highly utilized, TSMC makes more money than if Nvidia cuts back orders. (A version of rjc's napkin math with these caveats as explicit parameters is sketched below.)

Originally posted by: Creig
I never claimed to have concrete numbers, did I? Since neither company has released the information, I simply took the snippets of information that have leaked out regarding both GPUs and made a reasonable sketchwork of AMD vs Nvidia GPU cost ratios.
Rofl, you've been offering up the same "reasonable sketchwork" for years and have continually been proven wrong.

  • You and others said the same about G80, using the same flawed logic as above: too big, too expensive, low yields, can't make a profit.

    Result: Nvidia manages to sell $400 and $300 versions of G80 and posts record profits.
  • You and others said the same about GT200: too big, too expensive, low yields, can't make a profit.

    Result: Nvidia drops prices to $500 and $300 and still manages to turn a profit in a down economy. Recently they dropped their GTX 260 prices even lower, closer to a $200 price point.
In the meantime, AMD still has not turned a profit since acquiring ATI. These are facts.
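
To make the dispute concrete, here is rjc's napkin math rewritten with the caveats above as explicit knobs. A minimal Python sketch: the die sizes and the ~$5000/300mm-wafer figure come from the thread, while the yields and the per-process wafer prices are made-up placeholders, since nobody here knows the real numbers. It also uses the standard dies-per-wafer approximation, which subtracts an edge-loss term that rjc's "max" counts ignore.

```python
import math

WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2):
    """Usable dies per wafer: gross wafer area divided by die area,
    minus an edge-loss term for partial dies around the circumference."""
    r = WAFER_DIAMETER_MM / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_price, yield_fraction):
    """Wafer price spread over the dies that actually work."""
    return wafer_price / (dies_per_wafer(die_area_mm2) * yield_fraction)

# Die sizes from the thread; wafer prices and yields are placeholders.
# chizow's point: the 55nm wafer likely costs more than the 65nm one,
# and volume discounts shift both numbers.
rv770 = cost_per_good_die(256, wafer_price=5500, yield_fraction=0.7)  # 55nm
gt200 = cost_per_good_die(576, wafer_price=5000, yield_fraction=0.6)  # 65nm
print(f"RV770: ~${rv770:.0f} per good die ({dies_per_wafer(256)} dies/wafer)")
print(f"GT200: ~${gt200:.0f} per good die ({dies_per_wafer(576)} dies/wafer)")
```

Even granting GT200 a cheaper wafer, the die-size gap alone keeps its per-die cost well above RV770's under these placeholder numbers; the open question in this thread is only how wide the gap really is once actual wafer prices and yields are plugged in.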