Exactly.
There is no more volume for $1k CPUs.
Sure, there never was much, but now it's zero.
Well, as pointed out, you have Socket 2011, which is actually a far better proposition. It's like a workstation, but on the cheap.
Just curious, did Intel mention how much inventory declined in Q4 yesterday?
Code:
[B]Intel's inventory    Percentage of sales[/B]
Q3 2011: $4 billion      28%
Q4 2011: $4.1 billion    30%
Q1 2012: $4.5 billion    35%
Q2 2012: $4.9 billion    36%
Q3 2012: $5.3 billion    39%
[COLOR="Red"]Q4 2012: $?.? billion    ??%[/COLOR]
source: [URL="http://www.marketwatch.com/story/intel-bulls-should-hope-it-made-fewer-chips-2013-01-16?siteid=yhoof2"]Intel bulls should hope it made fewer chips[/URL]
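For what it's worth, the table implies a rough quarterly revenue figure, since revenue ≈ inventory / (percentage of sales). A quick back-of-envelope sketch; only the inventory and percentage figures come from the table, the derivation is my own:

```python
# Rough implied quarterly revenue from the table above:
# revenue ~ inventory / (inventory as a percentage of sales).
inventory = {  # quarter: (inventory in $B, % of sales)
    "Q3 2011": (4.0, 28),
    "Q4 2011": (4.1, 30),
    "Q1 2012": (4.5, 35),
    "Q2 2012": (4.9, 36),
    "Q3 2012": (5.3, 39),
}

for quarter, (inv, pct) in inventory.items():
    revenue = inv / (pct / 100)
    print(f"{quarter}: implied revenue ~ ${revenue:.1f}B")
```

The implied revenue hovers around $13–14B a quarter, so the rising percentage is driven by inventory growing, not revenue collapsing.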
The valid point is that, for home use, PCs are now fast enough that the upgrade cycle has slowed. By PCs I also mean laptops; the average consumer doesn't buy a PC in a case.
But there is still the corporate side of things. Faster PCs are important for a lot of industries, and even in stupid places like call centers you'll have equipment upgrades and new PCs for new employees. After three years of constant use the hard drives are usually wrecked and slow, so a business just buys a new box.
Then you have Asia, which has been saving Intel's bacon over the past 5 years.
They certainly are, and I'm waiting for them to release a decent product into the mobile market real soon. Success, however, depends on price, the ecosystem they choose to go with and, lastly, performance. Intel will also slowly but surely enter the phone/tablet market. Funny reading these doom-and-gloom posts.
Intel's inventory situation is only going to get worse if their flagship tablet chip is a heavily binned Haswell.
It's a pain in the ass to debug heavily multithreaded software at the moment, which is why developers are not writing heavily multithreaded apps. Unless you want to impose martial law on software developers, they will do as they see best fit for their labors. Surely the problem with performance is easy to figure out?
Intel killed multi-core computing by pricing their 4+ core parts at prices hardly anyone can afford. It suits their business to have smaller dies at the expense of ultimate performance. Don't blame the TDP cap: Ivy Bridge is 77W and could easily have been a ~3GHz octocore at 125W.
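A back-of-envelope sanity check of that octocore claim, assuming dynamic power scales roughly with cores × frequency × voltage²; the baseline clock and voltage numbers are my own illustrative guesses, not Intel specifications:

```python
# Rough dynamic-power scaling: P ~ cores * frequency * voltage^2.
# Baseline TDP is Ivy Bridge's 77W; the clocks and the small
# voltage drop are illustrative assumptions, not real specs.
def scaled_tdp(base_tdp, core_ratio, freq_ratio, volt_ratio):
    return base_tdp * core_ratio * freq_ratio * volt_ratio ** 2

octo_tdp = scaled_tdp(
    77.0,                  # quad-core Ivy Bridge TDP (W)
    core_ratio=8 / 4,      # double the cores
    freq_ratio=3.0 / 3.5,  # ~3 GHz instead of ~3.5 GHz
    volt_ratio=0.95,       # modest voltage drop at the lower clock
)
print(f"estimated octocore TDP ~ {octo_tdp:.0f} W")
```

Under those assumptions it lands around 119 W, which is at least consistent with the ~125 W figure above.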
Don't blame the software either. AMD has been trying to go much more multithreaded for years and has been thwarted, simply because Intel dominates the market and it doesn't suit them for software to be more multithreaded than it already is. If the desktop market is dying, it's because Intel has strangled it for all it's worth as it attempts to maintain its position. That's all there is to it.
If the software were available to make use of it, we could easily be looking at 16- or 24-core Kabinis next year. What's the point when no software uses them and they'd get shot down for poor single-threaded performance in 2013?
I wonder what they are planning to do with their "old" 22nm factories.
In fact, it improved. Intel's inventory finished at $4.7 billion, still above where it started 2012, but better than the $5.3 billion they had in Q3.
We have plenty of sub-100W GPUs; my own HD 7770 is 80W. The option exists for people to use lower-power GPUs if they want to, the same way they can still get a 130W i7-3930K if they want to.
Are you serious? Hardly anyone can afford a quad-core? I have seen a lot of outlandish statements on these forums, but this is right up at the top. Besides, you contradict yourself: you say that hardly anyone can afford a quad, but that they would buy them if the software were written better.
Why not public-domain those process flows and turn it over to academia or something so at least the next generation of engineers can study and learn from them?
They could have "reduced" the inventory with a 10% depreciation,
which is quite plausible given their recent low-cost CPU dumping.
That couldn't compensate for the yearly $1bn revenue shrink...
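The write-down idea checks out arithmetically: a hypothetical 10% depreciation of the Q3 figure alone would land close to the reported Q4 level.

```python
# Sanity check: a 10% write-down of the Q3 2012 inventory figure.
# The 10% rate is the hypothetical from the post, not a reported number.
q3_inventory = 5.3   # $B, Intel's Q3 2012 inventory
write_down = 0.10    # hypothetical depreciation
adjusted = q3_inventory * (1 - write_down)
print(f"Q3 inventory after a 10% write-down ~ ${adjusted:.2f}B")
# -> ~ $4.77B, close to the ~$4.7B reported for Q4
```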
Got a source for that 4.7B?
Thanks!
With 22nm it is easy to see why: nobody else has a working FinFET process, and opening this avenue to academia would give competitors access to Intel's solutions to problems they have yet to find.
As for why they don't license 180nm or 130nm... that I don't know.
Yeah, I'm talking about the older stuff; I'm not so silly or daft as to be thinking about the more recent nodes in my post above.
They only achieved that inventory reduction by lowering output at their factories. My point was this:
1. Intel is pinning their hopes for reviving the PC market on Ultrabooks with ULV parts, and x86 tablets with ULV parts
2. ULV parts are a deep bin sort of the same dies which go into standard voltage parts
3. In order to ramp up production of ULV parts to meet the new market segments, they need to also ramp up production of their standard voltage parts
If demand for their ULV parts rises at the same time as demand for their standard-voltage parts falls, which seems to be the case in today's market, they're in a real pickle.
Let's say 30% (a random number off the top of my head) of their 2-core GT2 dies can run at low enough voltages to go into ultrabooks or tablets. Then for every 3 extra ultrabook chips they manage to sell, they get 7 extra standard laptop chips to sell, which isn't the way they want to go.
Could someone with better knowledge of the semiconductor industry give me a clearer picture of how this can work for them? Are my yield/binning estimates way, way off? IDC, I'm looking at you here.
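For what it's worth, the bin arithmetic above can be written as a toy model; the 30% ULV bin rate is the made-up number from my post, not a real yield figure.

```python
# Toy binning model for the point above. The 30% ULV bin rate is a
# guess from the post, not a real Intel yield figure.
ulv_bin_rate = 0.30   # fraction of 2C/GT2 dies that bin as ULV

def dies_needed(extra_ulv_chips, bin_rate):
    """Total dies to produce, and the standard-voltage surplus,
    to obtain a given number of extra ULV chips."""
    total = extra_ulv_chips / bin_rate
    return total, total - extra_ulv_chips

total, surplus = dies_needed(3, ulv_bin_rate)
print(f"{total:.0f} dies -> 3 ULV chips + {surplus:.0f} standard-voltage chips")
```

So selling 3 extra ULV chips means producing 10 dies, leaving 7 extra standard-voltage chips to move in a shrinking market.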
What about reverse-engineering that old stuff with an electron microscope?
In that case they would have to take a charge, like AMD has done.