What your wall of text misses is that AMD was selling the R9 390 at the price points the RX 480 8GB now sells at. So to recap, the R9 390 had:
1.) A 438mm² GPU
2.) 8GB of GDDR5 across a 512-bit memory bus
3.) Bigger coolers due to a much higher TDP
4.) A much more complex PCB due to the 512-bit memory controller
5.) More complex power delivery circuitry
At 232mm², Polaris 10 is nearly half the size of the GPU in the R9 390; the RX 480 uses half the number of memory chips, has a less complex PCB, and requires less cooling.
Even a 50% increase in cost per transistor would still make Polaris 10-based cards cheaper to make than Hawaii- or Tonga-based ones. AMD also has its WSA with GlobalFoundries, so with CPU sales not being so hot, filling it with GPU wafers at least helps in that regard long term. From what I remember reading here, a lot of the market share increase was apparently down to the R9 380 and R9 380X (could be wrong there). But as you know, the R9 380 and R9 380X use a 359mm² GPU too, and tend to be higher in power consumption and cooling requirements, which increases costs.
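To illustrate the point with a toy bill-of-materials sketch (all numbers are made up for illustration, not actual AMD costs): even if the new node's cost per area, and hence roughly per transistor, were 50% higher, the smaller die, half the memory chips, and simpler board can still make the whole card cheaper.

```python
# Toy BOM comparison -- every figure here is illustrative, NOT a real cost.
def card_cost(die_mm2, cost_per_mm2, mem_chips, chip_cost, pcb, cooler):
    """Toy total: die + memory + PCB + cooler (assumed component costs)."""
    return die_mm2 * cost_per_mm2 + mem_chips * chip_cost + pcb + cooler

# Hawaii (R9 390): 438mm^2 die, 16 GDDR5 chips on a 512-bit bus,
# complex PCB and big cooler.
hawaii = card_cost(438, 0.10, 16, 3.0, 25, 20)

# Polaris 10 (RX 480): 232mm^2 die, 8 chips on a 256-bit bus, simpler
# PCB/cooler, but with a hypothetical 50% higher per-area wafer cost.
polaris = card_cost(232, 0.15, 8, 3.0, 15, 10)

print(f"Hawaii-class card: {hawaii:.2f}, Polaris-class card: {polaris:.2f}")
assert polaris < hawaii  # the whole-card saving survives the node premium
```

The die is only one line item; halving the memory chip count and simplifying the board and cooler is where much of the saving comes from, which is the point being argued above.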
The RX 480 and RX 470 have also replaced the R9 380, R9 380X, and R9 390. AMD's whole £150 to £300 price range is now served by one GPU instead of two.
But of course, as I mentioned earlier, you and your mates spin AMD's increasing shipment market share as a bad thing. They have simplified their whole range while making the replacement cards cheaper to make, and at the same time made sure more of their GlobalFoundries wafer allocation gets used long term.
Also, my current card is a GTX 960 and my previous one was a GTX 660 Ti, so I'm not sure what the point of bolding that sentence was.
It is speculative in that it covers TSMC's first 16FF process, which (to my knowledge) never saw broad adoption and was quickly replaced by the more attractive 16FF+ and several derivatives of it. That should be another hint that something major has changed in the cost relationship.
By the way, just stumbled over this image:
http://imgur.com/eezbRGE
That would indicate a healthy decrease of cost per million transistors at GF.
Yes, that is a competitive disadvantage for GF, but they would be crazy to pass it on to their customers.
As for yield curves, they seem to be okay for both foundries; we would have heard more from unhappy customers otherwise, as with the 20nm yield issues. GPUs are also well suited to binning, and there is a large enough performance gap between the 470 and 460 to fit another China-only bin in, for example.
Tonga has a beefier PCB than Polaris:
Tonga
Polaris
More phases, a 50% larger PCB, more secondary components, almost the same transistor count, and it needs a beefier cooler. I'm quite sure Polaris beats it on parts and assembly cost by a decent margin.
You have no actual numbers, yet you somehow think you are correct and that AMD doesn't know how to price its own products to make money?
45nm is normalized to 1, everything else is in relation to that. Yield has no place in a basic cost-per-million-transistors analysis; that's a factor with individual chips.

As far as that chart goes, it doesn't explain the mathematics behind those numbers, but as I said earlier, one potential source of savings is yields, and how each company accounts for them could change that factor.
However, that chart hardly shows healthy savings between 40nm and 28nm: a 45% saving versus a 12% saving. Even the chart notes that the savings are greatly reduced at FinFET nodes.
The Fury X has watercooled VRMs, and AMD's choice makes perfect sense. You also stopped watching the video right after that sentence, I assume. And a high rating on a MOSFET says little about its bulk price. You are grasping at some very weird straws here.

One more thing to note about the card: although the PCB is smaller, the power circuitry is crazy good for a reference design. The PCB is dense and incredibly overbuilt. It has six power phases for the chip, where the 290X has only five. It is meant to handle more power than a Titan PCB. Look at the caps, VRMs, and power phases as explained in the video, and you will see that although the PCB is small, it is not a cheap one.
The 290X reference PCB can handle some 500 amps of current. The RX 480's can handle 600 watts. Some of its components are even better than the Fury X's.
This is one more thing showing that this was not meant to be a $200 card. If AMD could have cut costs anywhere and increased its margins, it would have been the PCB.
If AMD had known they were going to price this card below $200, they would have given it a cheaper board.
Well, I took a look at the August Steam hardware survey, and the GTX 1060 share is almost twice the RX 480 share already, kinda impressive as the 1060 came out later and should have less volume available. Since there is no reason to believe that AMD users avoid Steam, I'm pretty sure this increase in global AMD market share is mostly related to miners, and it's not a gaming-related thing.
http://store.steampowered.com/hwsurvey/directx/
45nm is normalized to 1, everything else is in relation to that. Yield has no place in basic cost per million transistors analysis, that's a factor with individual chips.
As for the savings, even a small reduction screws your earlier cost analysis and conclusion.
The Fury X has watercooled VRMs and AMDs choice makes perfect sense. You also stopped viewing the video right after that sentence I assume. And a high rating on a MOSFET says little about its bulk price. You are grasping at some very weird straws here.
Also, ever since AMD took over ATI, it has somewhat overbuilt its reference boards regardless of price and phased them out early in the GPU lifecycle in favor of custom boards. But this time it's an indication that AMD released a bad product? I have a hard time following that reasoning, since most sales won't reflect the basis of your argument.
By the way, fun fact: we haven't heard about bad return rates for AMD products in a long time now (apart from the initial Fury X batch); ten years ago this was a staple of trash-talking ATI products.
You're telling me that parametric yield modeling was applied to one table differently than the other? Prove it; that could cause a ruckus. (Edit, in case this isn't clear: parametric yields aren't going to help you in a cost analysis, especially since GPUs are a bit of an odd case thanks to binning. It's better to take the numbers as they are and assume that everything has been taken into account. A company that lied about them would be in deep trouble with investors.)
I said Tonga has more phases, a larger PCB, more secondary components, and probably a more costly cooler. I did not evaluate the performance of individual components, but the cost structure of the card.

The AIB cards, although costing more than reference pricing, take a chopping block to the components compared to reference (not to mention you failed to address your previous statement that there was a downgrade compared to Tonga, which I showed otherwise).
HD 3850 / HD 3870: 192mm² die for $179/$269.

And as I have continued to stress, if AMD had intended from the get-go to price the card this low, they would have used a smaller die and a cheaper PCB, because this is the cheapest price at which AMD has ever launched such a big die on a new node.
There is, actually, outside of the US.

There should be more than a 28% price difference for a card that performs 70% better.
Now that we have established that this data is rather uninformative regarding sales, it's best to wait until actual sales volume numbers are released sometime after the current quarter.
Well, I took a look at the August Steam hardware survey, and the GTX 1060 share is almost twice the RX 480 share already, kinda impressive as the 1060 came out later and should have less volume available.
It's the best selling VGA @ Amazon US at the moment. RX 480 is at #25.
Doesn't matter unless you can prove more people with AMD cards opt out than people with Nvidia cards. There are still plenty of people opting in, so for gaming it will give an accurate split of GPU usage.

What? Lots of gamers are not participating in the HW survey, including me. So no, Steam is NOT a relevant measure at all.
I was just thinking that... who cares if it's for gamers or used as a coaster? Pretty sure the money is still green?

This is maybe the 8th time that nVidia partisans have tried to shift a discussion about sales numbers to one about sales specifically to gamers.
It's red for AMD!

I was just thinking that... who cares if it's for gamers or used as a coaster? Pretty sure the money is still green?
On top of this, you have to accept that since the broken Steam hardware survey does not distinguish XF/SLI from single dGPUs, there really is no meaningful data here regarding actual units owned. It can show, in its own broken, casual way, that a greater percentage of users have nVidia hardware in their boxes compared to AMD, but it doesn't show actual unit numbers from user to user in a real comparison.
Think about this: how many owners tend to buy 2x AMD cards and run them XFire for a performance/$ gain over the similarly-tiered or greater-tiered nVidia card? Likewise, plenty of nVidia users SLI lesser cards to gain performance parity with higher tiered nVidia cards.
For the dozenth time: steam hardware survey doesn't tell us much of anything about actual sales. Speculation from review sites doesn't tell us anything about sales. Only quarterly reports from the actual companies will tell us anything.
The Steam HW survey will give you a fairly accurate picture of the relative ratio of 480s vs 1060s in the gaming market. It will not give you market share per quarter (the survey only gives cumulative numbers) and it will not give you the total number of cards sold (non-gamers and mining are not included). However, as there is NO statistical reason for Radeon owners or GeForce owners to opt in/out in significantly different proportions, and there is NOT a statistically significant number of dual-GPU owners, especially for midrange cards like the 480/1060, it will give a decent picture of cumulative relative market share in the gaming market.
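The statistical point here is easy to demonstrate: as long as the opt-in probability does not depend on the brand owned, the survey's GeForce:Radeon ratio converges on the population ratio regardless of how many people opt in. A toy simulation (population sizes and opt-in rates are invented purely for illustration):

```python
import random

random.seed(42)

# Hypothetical gamer population: 2 GeForce owners for every Radeon owner.
population = ["GeForce"] * 200_000 + ["Radeon"] * 100_000

def survey(opt_in_rate):
    """Simulate the survey: every user opts in with the same probability."""
    sample = [gpu for gpu in population if random.random() < opt_in_rate]
    return sample.count("GeForce") / sample.count("Radeon")

# Whether 5% or 50% of users opt in, the sampled ratio stays near 2:1,
# because the opt-in rate is independent of the brand owned.
print(survey(0.05), survey(0.50))
```

The survey only becomes biased if the opt-in rate itself differs by brand, which is exactly the claim nobody in the thread has evidence for.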
God, this forum keeps whining about SLI/CF not being reported, while not realizing that the number of SLI/CF midrange systems is astonishingly low (<1%) in proportion to non-CF/SLI systems.
Basically nobody outside of hardware forums and high-end enthusiasts runs multi-GPU. If you think it's any significant number, you have been spending too much time in the forums.
I realize that multi-GPU is niche (I don't run it and I've never cared to), but it would be worth quantifying this claim, no? There is a significant overlap of miners who game and who run mining and Steam on the same machine. These are people with multiple pairs of GPUs in one system (more often than not AMD), and they report Steam's inaccurate recording of their hardware. Until we have an actual quantifiable number for just how niche this part of the market is, I will continue to speculate along the lines that a single miner with 4-8x+ as many cards as the more mainstream single-card gamer pretty much accounts for that discrepancy with his one data point. Too bad the survey doesn't work in a way that would let us get real numbers for these situations.
All it allows us to do is speculate. We can't draw any real conclusions from this tool because it is unevenly broken across too many variables to provide meaningful data. (Users tend to report inconsistencies with how AMD hardware is recorded compared to nVidia, but there you probably have more speculation on top of the rest. Until such things are actually confirmed one way or the other, this tool remains wholly useless for this discussion.)
Remember that one Russian? Then you post this in this thread:
The logic makes no sense, huh? Odd considering it was your own logic just before the launch of Polaris.
The consensus among people here was that the RX 480 would launch at a price of $250 to $300, which was acceptably below the $350 price of the 970 at the time. Prices ended up tanking to clear out inventory, but that is a temporary market condition that was irrelevant as far as new product launch prices go. Car makers don't drop the list price of 2017 models because 2016's are being sold well below MSRP to clear out inventory.
At the end of the day, it's actually shocking to see so many NV fans want AMD to fail and not gain market share, yet later claim they want competition from AMD for better prices and faster dGPUs. It makes no sense whatsoever. Any objective and logical PC gamer would want something close to a 50/50 market share split between NV and AMD for the fiercest competition in this space. The fact that NV continues to hold 70%+ dGPU market share is extremely unhealthy for our industry. I guess if some of you like paying $700 for upper-midrange GPUs as a way to gloat about your "high-end" PCs, have $ invested in NV, work directly or indirectly for NV, or are just blind fans of a brand like people cheering for a sports team, then I suppose you would NOT want AMD to compete.
Even on the mining topic, until NV GPUs became viable for mining, NV users constantly attacked mining as something negative. Clearly a loyalty/brand-attached response, or a complete lack of intellectual curiosity to use a PC for something new and learn something new. It's almost face-palming to read on here how the RX 480's mining ability and mining sales are a negative for AMD, when the more products AMD/NV sell, the more viable the GPU-making business is for the future. If I know I can keep buying AMD/NV GPUs with a huge mining subsidy, then AMD/NV are getting my $ every generation. No one here criticizes NV for selling bucket-loads of GPUs for CUDA, distributed computing, and rendering applications that most of us don't use. So why is it that I read how "most AMD cards sold are to miners," so unless these GPUs show up on Steam ASAP, they are worthless market share gains?
This is a false equivalence. Bashing AMD != wanting AMD to fail.
Total PC gamers in the world: 1.25 billion in 2014.

It's not a random sampling. It doesn't accurately identify the hardware in users' systems. It only tracks the hardware of users who do not have certain IGPs, do not have multiple GPUs, and volunteer their data. It is great for that demographic, if that appeals to you. It is useless for tracking sales data, which people seem to think it can accurately reflect. You don't have to have much experience with data to understand how poor a metric this is for what many users here seem to think it is.
Of course I remember and I stand 100% by that statement again. It seems you actually didn't understand the statement or the context of members on here predicting RX 480 pricing prior to its launch.
I outlined the $249-299 price range and suggested that as long as RX 480 was within 25% of $399 1070's performance, it would still have superior price/performance & thus market viability. How so? Because $399 1070 / $299 RX 480 = 33% more expensive but if RX 480 came within 25% of 1070's performance, even at $299, it would still be a great value. The reason I also provided the lower range going down to $249 was because there was also a real possibility that RX 480 would be nowhere close to "within 25% of GTX1070's performance." I just never typed that out in that particular paragraph since for months I talked about a scenario where RX 480 would only be at R9 390/X level of performance and no more.
This is why I gave a range of $249-299 because the rumours for RX 480's performance ranged from around R9 390 all the way to Fury X level of performance. 1070 is 13% (1440p) and 19% (4K) faster than Fury X at TPU.
What you also failed to mention by calling my post a "gem" was that there were 2 additional rumours flying before RX 480 launched. First, there was a rumour that RX 480 series could launch in 2 flavours: 2304 and 2560 SP. Second, there was a rumour that RX 480 would overclock to 1.5-1.6Ghz and could come with up to 1.4Ghz boost out of the box. On that basis, I even came up with "within 25% of GTX1070's performance" at $299 as the high-range.
If you recall my posts, I personally believed that RX 480 would be closer to R9 390/X level of performance. Since I provided 2 distinct possibilities prior to RX 480's launch specs/performance, you could rightfully criticize my posts for suggesting RX 480 could come with 2560 shaders or approach Fury X performance at $299. You could also rightfully criticize my lower performance estimate (~ R9 390/390X) and $249 price level since AMD actually launched the RX 480 for $239. I was $10 off on my lower end estimate (if RX 480 was to come in around R9 390 level of performance).
Yes, my logic makes perfect sense since we found out that RX 480's full die was 2304 shaders, not 2560 shaders. We also found out its boost clock was only 1266mhz and its maximum overclocking was around 1350-1400mhz, nowhere close to the earlier rumoured 1.5-1.6Ghz on air numbers. Since it was the PC gaming community, not AMD, that incorrectly estimated RX 480 could end up anywhere between R9 390 and Fury X, it stands to reason that the $249-299 range that was provided by many posters was made under the assumption of different performance ranges (unless you think we suggested RX 480 would have Fury X level of performance for $249....).
That's right, but what you and others in this thread aren't acknowledging is that the $250-300 pricing estimates posted by various members were tied to differing hypothesized levels of RX 480 performance. You aren't connecting the dots as to why those range estimates were made in the first place. How obvious can it be that, when those ranges were made, it would have been stupid to even estimate that AMD would replace a $329 R9 390 with a $299 RX 480, because:
1) It was not aligned with bringing R9 290 level of performance to the masses (i.e., substantially below $349);
2) It was not aligned with the upper-level estimates that RX 480 could beat R9 390X by 5-10% and approach Fury or even Fury X.
In summary, by continuing to focus on the $250-300 range estimates made prior to the RX 480 launch, you and others completely fail to account for the performance caveats attached to the lower and upper ends of those ranges. In other words, the price estimates were commensurate with differing levels of the RX 480's projected performance.
---
The rest of the thread has turned into an analysis of AMD's profitability and gross margins, which has nothing to do with real-world market share gains. It seems the earlier predictions that AMD would not go above 20% market share have been proven wrong, so the same posters have now moved on to discrediting AMD's market share gains by claiming those gains are meaningless without profits -- a topic that was never in contention and is a separate discussion from market share gains.
Some other posters are implying that RX 470/480 sales to miners are irrelevant since they aren't sold to PC gamers. There are 3 major flaws with this criticism.
First, there is for sure a certain % of GTX1060/1070 cards also sold to miners but that's never discussed in the context of GTX1060/1070 sales numbers. Even if there are fewer 1060/1070 cards sold for mining, the fact that it's completely dismissed but AMD's market share is criticized for selling to miners is a double standard. After the W10 Anniversary update, it's also possible to make $ with 1060/1070.
Second, when miners find a better mining card and/or if mining no longer becomes profitable, they sell their mining GPUs. Who buys those GPUs? Most likely PC gamers. That means unless the RX470/480 cards fail and aren't replaced under warranty, the vast majority of RX 470/480 cards will eventually end up as gaming cards in the hands of a 2nd or 3rd owner.
Third, if we only count Steam market share numbers as viable, what about all the NV 1060/1070/1080/Titan XP cards sold for rendering, CUDA, distributed computing, YouTube/game streaming, etc.? Should those cards also be subtracted from NV's 70% market share, since they aren't used 100% for PC gaming on Steam? What difference does it make if AMD sells an RX 480 to a miner first or if a 1060 is bought for games? If you want to discuss PC gaming optimizations based on the prevalence of gaming GPUs in PC gaming rigs, that's a completely different topic and has nothing to do with overall dGPU market share. Again, it seems like a desperate attempt to discredit AMD's market share gains with trivial reasons.
Another major issue with Steam sales numbers is that often older generation cards (sometimes 5-7 year old cards) continue to show gains on Steam, despite the fact that these dGPUs are no longer sold in the market.
What's even funnier is that the most vocal critics of AMD products on our forum over the last 6-7 years are never going to buy an AMD GPU, period. So why waste forum space, people's energy, and time bashing everything related to AMD?
This forum has so little technical knowledge and curiosity left, that instead of hundreds of PC gamers in our CPU section buying an i5-6400 and i7-6700 and overclocking them on $80-100 boards, we have daily trolling on every AMD CPU/GPU section. You guys think newcomers to PC/tech are interested in reading all the childish bickering about who has more market share in dGPU space?
It's FAR, FAR more interesting to read about how someone saved $ buying an i5-6400, a $15 Gammaxx 400, overclocked it to 4.6Ghz, and used the $ saved to move up from an RX 480/1060 to the GTX1070. This is what brings the community together, not bickering about how AMD could have priced RX 480 at $300 instead of $250. It's interesting how no one discusses how much gross margins Noctua or Corsair make on their cooling products, or how much SeaSonic or EVGA make on their Titanium PSUs. I guess the profitability and market share of those firms isn't fun to discuss, huh? It seems the vast majority of posts on this forum are pro-brand, rather than pro-PC building.
I browse other forums, and I read the same fanboyism drivel. The typical no-brains-involved recommendation for a PC build is some Asus Z170-A board, Arctic Silver 5, a CM 212 Evo (or the highest-end Noctua NH-D15), and an i5-6600K. For a technical forum, that recommendation requires 0 knowledge, 0 research, 0 skills. Adding value is outlining the issues the Asus Z170-A has despite being the #1 best-selling Z170 board, how a $15 CPU cooler now beats a CM 212, and how there is absolutely nothing wrong with buying an i5-6400 either, because games don't use AVX instructions. But no, I rarely read anything informed on here anymore, just 10-20 pages about wafer costs, gross margin projections, Steam number justifications, etc. -- all things that have nothing at all to do with building a PC for someone who is 13-25 and wants to learn and get into PC gaming.
Didn't we join this forum to learn about the tech industry, learn from each other, share our experiences? It seems some who pretend to want competition aren't at all interested in the reality of what competition actually means. For example, once it finally became viable to mine ether with Pascal, RX 480 owners aren't bashing GTX1060/1070 for mining but rather welcome them as great alternatives. It also means with mining in place, we have a reasonable upgrade path via both AMD/NV cards and continue mining. Ironically the miners on this forum are actually pretty objective since they care more about mining/making $ with dGPUs than which brand has higher market share on Steam.