
Do you think Intel could destroy AMD in making video cards?

Gen line will continue and there is plenty of goodness to come beyond Gen9; we have been working on Gen10 for nearly a year and Gen11 is in definition ... and I can't say anything about those either 🙂

:awe:

Does that mean 10nm won't be another Tick+ like IB was? Only about a year of work doesn't seem like very much, given that Cannonlake will be released in 2016.

How do you think your new Gen8/9/10 architectures will compete against the competition, like Maxwell, if you ignore process differences? Do you think you can also achieve a 2X efficiency increase? Do you think you will ever leapfrog Nvidia's architecture just like Core is the best CPU architecture 🙂?
 
I don't think it is wise or useful for me to give an answer - If I give a pessimistic answer, people will use that as a hammer to say "even Intel doesn't believe they have good graphics architecture"; if I give an optimistic answer people will jump on me now saying they don't believe me; and if it later turns out that some game or benchmark doesn't do well they will drag up this thread to say they told me I was full of it.

Aggravating that is the fact that it isn't really possible to normalize for process, tdp design points, memory bandwidth (including impacts of caches like LLC), ...
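As a toy illustration of the normalization problem described above (every number below is invented, not a real benchmark), perf-per-watt rankings can flip depending on which TDP design point you compare at:

```python
# Toy model: fps of two hypothetical GPU architectures at two TDP
# design points. All numbers are invented for illustration.
perf = {
    "arch_A": {15: 30.0, 95: 120.0},  # scales well when given power
    "arch_B": {15: 40.0, 95: 110.0},  # tuned for low-power operation
}

for tdp in (15, 95):
    # Rank by perf-per-watt at this design point.
    best = max(perf, key=lambda a: perf[a][tdp] / tdp)
    print(f"At {tdp} W the perf/W leader is {best}")
# The leader flips between design points: arch_B at 15 W, arch_A at 95 W.
```

Whichever design point you pick silently decides the "winner", and that's before memory bandwidth or process differences come into it.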
 

Not to mention that you would be breaking NDA. But a very wise answer nonetheless.
 
Mantle needs to either A) Improve faster (being an open standard it may be able to do so) or B) get a higher adoption rate in games.

AMD have said DX12 will not obsolete Mantle: being hardware-specific, Mantle's first goal was to reduce CPU overhead, and it will next be improved to enhance GPU rendering.

Now, whether they can accomplish that is another matter.
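A back-of-envelope sketch of the CPU-overhead point (both the call count and the per-call costs below are invented): frame time spent in the driver is roughly draw calls times per-call overhead, which is why a low-overhead API like Mantle or DX12 helps CPU-bound scenes:

```python
# Back-of-envelope model of driver overhead per frame; both the call
# count and the per-call costs are invented for illustration.
def cpu_ms_per_frame(draw_calls, overhead_us_per_call):
    """CPU time spent submitting draws, in milliseconds."""
    return draw_calls * overhead_us_per_call / 1000.0

legacy = cpu_ms_per_frame(10_000, 5.0)        # high-overhead API path
low_overhead = cpu_ms_per_frame(10_000, 0.5)  # Mantle/DX12-style path

print(f"legacy: {legacy:.1f} ms, low-overhead: {low_overhead:.1f} ms")
# 50 ms of submission alone caps the frame rate at 20 fps;
# 5 ms leaves plenty of headroom for the rest of the frame.
```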
 
They will never do it if they can't make a GPU that actually clocks up. For the life of me I can't get my HD 4600 to clock up in certain programs; I'm getting better fps in FurMark than in Minecraft. :\
No driver setting helps, nothing in Windows power settings works, and the official reps on the official forum have no idea.

I mean, I'm sure it'll be an option eventually, but long waits for drivers is part of the reason people don't like AMD.
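For what it's worth, on Linux the i915 driver exposes the iGPU clocks through sysfs, which makes a stuck-clock check scriptable. The sketch below assumes those paths (the card index and attribute names may differ per system), and Windows users would need a vendor tool instead:

```python
from pathlib import Path

# Sketch for Linux systems with the i915 driver, which exposes iGPU
# frequencies under /sys/class/drm/card0/ (path assumed; may be card1
# etc. on multi-GPU systems). The heuristic itself is a pure function.

def looks_stuck(cur_mhz, min_mhz, max_mhz, load_is_high=True):
    """Heuristic: under load, the GPU should sit well above its floor."""
    span = max_mhz - min_mhz
    return load_is_high and span > 0 and (cur_mhz - min_mhz) < 0.25 * span

def read_mhz(name, base=Path("/sys/class/drm/card0")):
    return int((base / name).read_text())

if __name__ == "__main__":
    try:
        cur = read_mhz("gt_cur_freq_mhz")
        lo = read_mhz("gt_min_freq_mhz")
        hi = read_mhz("gt_max_freq_mhz")
        print("possibly stuck" if looks_stuck(cur, lo, hi) else "clocking up")
    except OSError:
        print("i915 sysfs interface not available")
```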
 

What about a realistic answer? I've posted some questions in the hope you'd pick one of them up, 'cause you probably aren't allowed to disclose everything, but if you're confident about something, why not?
 

I honestly think you're lucky he even responded to such a question.

There are issues on so many levels.

No company would want its employees running off at the mouth about future products and designs that are in development. Surely you can piece together at least one reason why.

And then to top it off,
It's on the internet.

Have you heard of JFAMD?
 
Not sure Intel will do discrete gaming cards in the future, but they definitely need AI cards (preferably built in their own foundries) to reclaim lost glory.
 
Intel can beat everybody; they just allow others to survive.
To quote Thanos - Should have gone for the head.
Intel is getting close to wiping out mid-range GPUs with their iGPUs; they don't really need a discrete card entry at this point. Maybe if Nvidia and AMD start to fail because Intel has taken away 80% of their markets, then Intel will capitalize on that for an easy win with a single high-end discrete project (targeted more at mobile but also available for desktop).
"Chipzilla will trample all others under foot!" This was a common perspective just a few years ago. Seems like just yesterday, the geniuses of Reddit were proclaiming Intel's very deep pockets would ensure great success in the discrete GPU market.
I think the entry would be too costly for the profit on offer. If the annual profit pie is, say, $2 billion, it's currently divided between AMD and Nvidia; if Intel comes in, it would be divided three ways. Nvidia would likely still have the highest share, then AMD, with Intel likely holding a small 5-10% share of the market.
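Running the post's illustrative numbers (the ~$2B pool and the 5-10% Intel share are the poster's guesses; the 70/30 Nvidia/AMD split of the remainder is an assumption added purely for illustration):

```python
# Split a hypothetical annual profit pool three ways. The pool size and
# Intel's share come from the post; the 70/30 Nvidia/AMD split of the
# remainder is assumed for illustration only.
def split(pool, intel_share, nvidia_frac_of_rest=0.7):
    intel = pool * intel_share
    rest = pool - intel
    return {
        "Intel": intel,
        "Nvidia": rest * nvidia_frac_of_rest,
        "AMD": rest * (1 - nvidia_frac_of_rest),
    }

pool = 2_000_000_000  # the post's ~$2B annual profit pie
for share in (0.05, 0.10):
    s = split(pool, share)
    print({name: f"${v / 1e6:.0f}M" for name, v in s.items()})
```

Even at the optimistic 10% end, Intel's slice stays small next to the development cost of a competitive discrete GPU line, which is the post's point.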
While Slick could not foresee the crypto and AI booms, he was damned close to nailing it with AMD and Intel's combined market share. Most impressive of the predictions I have read so far.
Intel is already gaining 5%+ graphics share a year. And nVidia today is as big as AMD is with GPUs and APUs combined.

There is no reason for Intel to buy anyone. The result will be the same.
How it started ^

How it's going -

 