What they couldn't do in the last ~10 years with much bigger R&D budgets will suddenly happen in 2016/2017?
Things have changed a lot in CPU design since 2005...
http://www.synopsys.com/home.aspx
Pay this site a visit.
So you're saying that all the other companies, the ones that increase their R&D budgets, are wrong, unlike AMD, who cuts it? Just because you found something on a site relating to efficiency improvements?
Riiiiiiiight.
And your point with that wild-ass-guess-with-no-evidence-to-back-it-up statement was exactly what?
Do you also have benchmarks of AMD's Zen 14 nm server CPUs?
Server CPUs are where the big profit margins are. If Zen is a competitive architecture, then focusing on regaining lost server market share first makes sense - especially if yields at 14 nm are low at first. In my opinion, this is actually good news - it means that AMD is optimistic about Zen's performance.
They increase their spending because they lack the necessary IP.
How much money would it take to acquire the IP and expertise that AMD currently has in GPUs and CPUs?
Just look at Intel, which threw billions into GPU R&D only to end up with barely 50% of the perf/watt of AMD's IGPs, despite the latter being a node behind.
Did you seriously just imply that Intel lacks expertise in CPUs that AMD has?
That is what he did.
Yet I was clear in my post: what I said is that Intel does not have AMD's expertise when it comes to GPUs. You surely noticed that I didn't mention any specific firm CPU-wise; I only said that AMD has superior IP in this domain compared to most firms.
No argument there on either point. Thanks for the clarification.
Who has better GPU tech than AMD?
CPU-wise, who has as much IP as AMD, apart from Intel?
Could you point me to another firm that is capable of designing and validating x86 CPUs, apart from Intel?
Perhaps I wasn't clear: I said that I agree with you and then thanked you for clarifying your statement, which I had originally misunderstood!
Ok, sorry for my misunderstanding of your misunderstanding!
Could you point me to another firm that is capable of designing and validating x86 CPUs, apart from Intel?
Technically, VIA can also do it -- but I don't think they do it as well as AMD. It's a little funny that people complain about FX performance -- maybe you should try gaming on a VIA QuadCore Isaiah chip? I think even the Intel fanboys would appreciate AMD hardware a little more if they ran a VIA CPU for a week.
(Disclaimer: I do own a Jetway motherboard with a VIA Nano, so I'm not hating on VIA.) But an FX-8350 is an absolute rocket ship compared to any CPU VIA has manufactured. VIA does have a decent niche in embedded, though.
Unless AMD finds a way to make their R&D engineers work 15% better, they will have to either scrap or scale back something, and guess what: there isn't much they can scrap right now.
ARM wasn't so tiny before the smartphone boom. Remember, they were in nearly all earlier phones as well, and in plenty of other devices. Not sure how anyone can compare that to AMD.
So to sum it up: ARM went from $1.2B market cap by the end of 2008 to $16.05B as of today (check the link in my previous post).
And ARM managed to capture more or less the complete mobile phone and tablet segment. A segment that Intel has been trying desperately for years to enter without succeeding, and in which it is currently losing over $4B a year(!). That's more than 3x ARM's entire 2008 market cap, and Intel is losing that per year in that one segment alone!!
Now please tell us how this is possible, if the size of the R&D budget is all that matters?
They look to be legacy x86, 486DX-like CPUs, which are no longer covered by patents; but this limits the instruction set to what was available 20 years ago. It seems the most advanced of them are limited to MMX and 32-bit for this reason.
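To make that concrete, here is a minimal sketch (my illustration, not from any poster above) of how you could probe at runtime which of those instruction-set features a given x86 chip actually reports. It assumes GCC or Clang on x86/x86-64 and uses their <cpuid.h> helper; the feature bits (MMX = EDX bit 23 of leaf 1, SSE = bit 25, SSE2 = bit 26, 64-bit long mode = EDX bit 29 of extended leaf 0x80000001) are the standard CPUID flags documented by Intel and AMD.

/* cpu_features.c -- minimal CPUID feature probe (GCC/Clang, x86).
   Build with: cc -o cpu_features cpu_features.c */
#include <cpuid.h>
#include <stdio.h>

static const char *yn(unsigned int reg, int bit)
{
    return (reg & (1u << bit)) ? "yes" : "no";
}

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* Leaf 1: standard feature flags, reported in EDX. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not supported (very old CPU).");
        return 1;
    }
    printf("MMX : %s\n", yn(edx, 23));
    printf("SSE : %s\n", yn(edx, 25));
    printf("SSE2: %s\n", yn(edx, 26));

    /* Extended leaf 0x80000001: long-mode (x86-64) flag in EDX. */
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx))
        printf("x86-64 long mode: %s\n", yn(edx, 29));
    else
        puts("x86-64 long mode: no (extended leaf absent)");

    return 0;
}

On anything modern all four lines print "yes"; on the patent-expired 486/MMX-class designs described above, you'd expect everything past MMX to come back "no".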
