
[fud] AMD Radeon 7770/7750 coming in February on 28nm process - specs/prices

😵
The halo card of any generation has never been a good performance/dollar proposition.

99% of us know this. Almost everyone sees the broken argument. Folks just stop responding; at this point you're just encouraging it. Every time you point out the erroneous nature of the argument, it's just echoed back to you again.
 
Making the wafer takes about a week, but that's just the start.

Taking the wafer and making the circuits on it takes roughly 8-10 weeks. But you make all those dies on the wafer at the same time. That is why the industry wants bigger and bigger wafers.

Then you have to verify/test, bin, package, and ship the chips.

The entire process can take 90 days, sometimes even longer. Rarely is it less than 6 weeks in a fully streamlined production environment (which is not what a new node can be called).
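The point above about bigger wafers yielding more dies per pass can be sketched with a common first-order estimate: wafer area divided by die area, minus an edge-loss term for partial dies at the rim. The ~350 mm² die size here is an assumption for a Tahiti-class chip, used purely for illustration:

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order gross die count: wafer area over die area,
    minus a correction for partial dies lost at the wafer edge."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Assumed ~350 mm^2 die (roughly Tahiti-class); figures are illustrative.
for diameter in (200, 300):
    print(diameter, "mm wafer:", gross_dies_per_wafer(diameter, 350.0), "gross dies")
```

Under these assumptions a 300 mm wafer yields well over twice as many gross dies as a 200 mm wafer, even though it has only 2.25x the area, because proportionally fewer dies are lost at the edge.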

Eight to ten weeks is a huge timeframe just to push out one of those small biscuits. I read articles saying that setting up one of these facilities costs upwards of 10 billion. Makes it hard to see how they are making money on this, but I guess they are. 😎
 
😵
The halo card of any generation has never been a good performance/dollar proposition.

I don't know about "any" or "never", but overall I would agree with the premise.

I can make the point that this is the first time in GPU history that the percentage MSRP increase over the previous GPU was larger than the actual overall percentage performance gain. What makes it stand out even more is that this was done on a substantial and significant node change and a smaller die.
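The claim above boils down to comparing two percentages. A quick sketch, using assumed launch MSRPs ($369 for the 6970, $549 for the 7970) and an assumed average performance uplift; all of these figures are illustrative assumptions, not verified numbers:

```python
def pct_change(old, new):
    """Percentage change from old to new."""
    return 100.0 * (new - old) / old

# Assumed figures for illustration only.
msrp_6970, msrp_7970 = 369.0, 549.0
perf_gain_pct = 25.0  # assumed average uplift of the 7970 over the 6970

price_rise = pct_change(msrp_6970, msrp_7970)
print(f"MSRP rise: {price_rise:.1f}% vs performance gain: {perf_gain_pct:.1f}%")
```

With those assumptions the MSRP rises about 49% against a ~25% performance gain, which is the shape of the argument being made: the price delta outruns the performance delta.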
 
99% of us know this. Almost everyone sees the broken argument. Folks just stop responding; at this point you're just encouraging it. Every time you point out the erroneous nature of the argument, it's just echoed back to you again.

Agree with this; I have yet to see much in the way of rational discussion in this regard. I'm not going to take you seriously if you:

1. Compare raw 7970 pricing to 6970

But

2. Compare performance pricing of the Radeon 7970 3GB to the GTX 580 1.5GB.

And

3. Ignore that hardly anyone is saying the 7970 is priced for value.


I personally don't understand where the need to repeat this flawed tripe is coming from.

Edit:
I don't know about "any" or "never", but overall I would agree with the premise.

I can make the point that this is the first time in GPU history that the percentage MSRP increase over the previous GPU was larger than the actual overall percentage performance gain. What makes it stand out even more is that this was done on a substantial and significant node change and a smaller die.


Hmmm, wonder what comparing the GeForce 5000 series to the 6000 series halo cards would look like. I should then say anything that doesn't match that $/performance ratio is an outrage.

AMD isn't operating in a vacuum; are you seriously suggesting it should price the 7970 below the GTX 580 simply to appease your sensibilities? I'd be more concerned with what Idontcare mentioned: that AMD and NVIDIA will go light on the pricing competition, and this time they won't pass notes to each other that can get them in trouble.
 
To me, the reason for the past change in strategy was to obtain stronger market share. Redefining performance price points and concentrating resources where most consumers are offered a vehicle to do this, and in some respects it actually worked: AMD took the overall discrete market-share lead from NVIDIA not too long ago. One thing is for sure to me: the move was disruptive.

AMD has gone in another direction with more premium-based pricing -- let's see how it plays out over time.
 
Making the wafer takes about a week, but that's just the start.

Taking the wafer and making the circuits on it takes roughly 8-10 weeks. But you make all those dies on the wafer at the same time. That is why the industry wants bigger and bigger wafers.

Then you have to verify/test, bin, package, and ship the chips.

The entire process can take 90 days, sometimes even longer. Rarely is it less than 6 weeks in a fully streamlined production environment (which is not what a new node can be called).
I'm not sure just how long it is for Tahiti, but AMD did note that their 28nm designs take longer to fab than their 40nm designs, which is why there was a longer gap between final production and the product launch with the 7970.
 
Eight to ten weeks is a huge timeframe just to push out one of those small biscuits. I read articles saying that setting up one of these facilities costs upwards of 10 billion. Makes it hard to see how they are making money on this, but I guess they are. 😎

They make money on the volume. The reason the fabs cost $10B is that they are scaled up in size to such an extent that they can push out 30,000-50,000 wafers per month (>1000 wafers per day 😱).

TSMC operates some of the largest fabs (so-called "Gigafabs") in the world, pushing out >100k wafers per month. That is a lot of wafers, and it takes a lot of money to buy enough tools that can operate in parallel such that the fab is pushing out 3000 wafers every day, 7 days a week.

So it might take 70 days for the wafer to exit the fab, start to finish, but they are starting 3000 wafers every day, and 3000 are exiting the fab every day.

The volume of wafers that are in the fab being worked on is referred to as "WIP" for Work In Progress. Gigafabs can have a WIP of 250,000 wafers that are at various stages of the manufacturing process.
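The relationship between starts per day, cycle time, and WIP described above is just Little's Law (WIP = throughput x cycle time). A quick check with the figures from this post, treated as round illustrative numbers:

```python
def steady_state_wip(starts_per_day, cycle_time_days):
    """Little's Law: average WIP = arrival rate x time in system."""
    return starts_per_day * cycle_time_days

# ~3000 wafer starts/day and ~83 days (about 12 weeks) in the line,
# both assumed round numbers consistent with the figures in this post.
print(steady_state_wip(3000, 83))
```

3000 starts per day at an ~83-day cycle time works out to 249,000 wafers in flight, right around the quarter-million-wafer WIP figure quoted for a gigafab.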

It really is a sight to behold if you ever get a chance to see the inside of a gigafab.
 
The volume of wafers that are in the fab being worked on is referred to as "WIP" for Work In Progress. Gigafabs can have a WIP of 250,000 wafers that are at various stages of the manufacturing process.

It really is a sight to behold if you ever get a chance to see the inside of a gigafab.


When I was younger I had a brief stint working in production planning, so I am somewhat familiar with the WIP process, though probably not in the same context. For me it was relegated to tracking current WIP via Oracle, looking for part shortages, and planning part orders. I didn't see much of the actual production; these were large server frames, metal housings and enclosures with some minor PCB additions such as backplanes.

I'd be interested to know if the WIP for wafers is tracked and broken down in the same way, down to each small finite part in a bill of materials. Isn't the bulk of this production done with one material being laid onto the wafer?

Also, I'm sure it would be awesome to check one out. I saw a short video about a fab that GloFo was building somewhere in the States recently; I'm not sure if it is finished yet. The thing was vast, on the scale of an automotive production plant.
 
Also, I'm sure it would be awesome to check one out. I saw a short video about a fab that GloFo was building somewhere in the States recently; I'm not sure if it is finished yet. The thing was vast, on the scale of an automotive production plant.

They just finished building one in Malta, NY. I think it's called Fab 8 or something like that. I actually went to an interview for a mechanical engineering job there, but it turned out to be an overnight shift and I wasn't ready for something like that. I talked to the interviewer a bit about the whole thing, but he was talking a little over my head. He was on the manufacturing side of things, so he pretty much had to work out how to get the design onto a wafer-sized scale. He said it usually takes around 6 months to go from concept to a manufactured chip. It was quite interesting to hear his side of things, and he still referred to AMD's graphics division as ATi. 😀
 