As I wrote, it was a rough estimate. Your considerations above are valid. But do they compensate enough for the fact that the price per transistor to the consumer is about 6-7x higher than it should be if the cost reductions from Moore's law were passed on to the customer? I have a hard time seeing how that could be the case...
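For the curious, here's roughly where a 6-7x figure comes from. This is a back-of-the-envelope sketch, not Intel's actual cost data; the ~5-6 year Penryn-to-Haswell gap and the clean two-year doubling period are my assumptions:

```python
# Sketch: if cost per transistor halves every ~2 years (Moore's law),
# the ~5-6 year gap between Penryn quads and Haswell implies a 6-8x
# cost reduction that could, in principle, have been passed on.
def moores_law_cost_factor(years: float, doubling_period: float = 2.0) -> float:
    """Expected cost-per-transistor reduction factor after `years`."""
    return 2 ** (years / doubling_period)

for years in (5.0, 5.5, 6.0):
    print(f"{years} years -> ~{moores_law_cost_factor(years):.1f}x cheaper per transistor")
# 5 years -> ~5.7x, 5.5 years -> ~6.7x, 6 years -> ~8.0x
# which is consistent with the 6-7x figure above.
```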
Intel R&D Expenses
Let's assume it takes two years to develop both the process and the CPU.
Expenses in the lead-up to Haswell: 20.169 billion USD
Expenses in the lead-up to Penryn quads: 11.628 billion USD
Research and development expenses have grown by a factor of about 1.7 (assuming they haven't started supplying champagne, caviar and harlots for their engineers). Did the number of processors shipped during that time grow by a similar factor, so as to maintain a similar R&D cost per CPU?
Let's assume for the moment that it did. We have then demonstrated a significant cost (22% of gross sales in 2013) that does not fall with the production-cost improvements that Moore's law brings.
But let's suppose Intel did as you said and cut their prices by a factor of 6, so revenue would fall to roughly 16% of today's. Compare the two percentages: R&D alone (22% of today's gross sales) would exceed total revenue (16% of today's gross sales). Intel would be paying more per chip for R&D than they would earn in revenue, even if sales & marketing, customer support and even production itself were free.
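To make the comparison concrete, here's the arithmetic in one place. The figures are the ones quoted above; the 6x price cut is the hypothetical under discussion, not a real scenario:

```python
# R&D growth between the two lead-up periods (figures from above, in $B).
rd_haswell = 20.169  # two years before Haswell
rd_penryn  = 11.628  # two years before Penryn quads
print(f"R&D growth factor: {rd_haswell / rd_penryn:.2f}x")  # ~1.73x

# The hypothetical 6x price cut vs. R&D at 22% of 2013 gross sales.
rd_share  = 0.22  # R&D as a fraction of today's gross sales
price_cut = 6     # hypothetical price reduction factor

new_revenue_share = 1 / price_cut  # revenue shrinks to ~16.7% of today's
print(f"Revenue after the cut: {new_revenue_share:.1%} of today's")
print(f"R&D as a share of that revenue: {rd_share / new_revenue_share:.0%}")
# ~132% -- R&D alone would exceed total revenue.
```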
You might argue that they would sell more chips than they otherwise would have, but would the extra volume be enough to cover the other costs I mentioned?
Yes, Intel make money. No, they do not gouge as badly as you suggest. If you want a company that operates on a revenue-neutral basis, try AMD.
