Do you think Intel could destroy AMD in making video cards?


witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Gen line will continue and there is plenty of goodness to come beyond Gen9; we have been working on Gen10 for nearly a year and Gen11 is in definition ... and I can't say anything about those either :)

:awe:

Does that mean that 10nm won't be another Tick+ like IB was? It seems to me that only one year of work isn't very much, since Cannonlake will be released in 2016.

How do you think your new Gen8/9/10 architectures will compete against the competition, like Maxwell, if you ignore process differences? Do you think you can also achieve a 2X efficiency increase? Do you think you will ever leapfrog Nvidia's architecture just like Core is the best CPU architecture :)?
 

rootheday3

Member
Sep 5, 2013
44
0
66
I don't think it is wise or useful for me to give an answer - If I give a pessimistic answer, people will use that as a hammer to say "even Intel doesn't believe they have good graphics architecture"; if I give an optimistic answer people will jump on me now saying they don't believe me; and if it later turns out that some game or benchmark doesn't do well they will drag up this thread to say they told me I was full of it.

Compounding that is the fact that it isn't really possible to normalize for process, TDP design points, memory bandwidth (including the impact of caches like the LLC), ...
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91

Not to mention that you would be breaking NDA. But a very wise answer nonetheless.
 
Feb 19, 2009
10,457
10
76
Mantle needs to either A) Improve faster (being an open standard it may be able to do so) or B) get a higher adoption rate in games.

AMD have said DX12 will not obsolete Mantle: being hardware-specific, Mantle's first goal was to reduce CPU overhead, and it will be improved further to enhance GPU rendering.

Now, whether they can accomplish that is another matter.
 

Bubbleawsome

Diamond Member
Apr 14, 2013
4,834
1,204
146
They will never do it if they can't make a GPU that actually clocks up. For the life of me I can't get my 4600 to clock up in certain programs; I'm getting better fps in FurMark than in Minecraft. :\
No driver setting helps, nothing in the Windows power settings works, and the official reps on the official forum have no idea.

I mean, I'm sure it'll be an option eventually, but long waits for drivers are part of the reason people don't like AMD.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106

What about a realistic answer? I've posted some questions in the hope you'd pick one of them up, 'cause you probably aren't allowed to disclose everything, but if you're confident about something, why not?
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81

I honestly think you're lucky he even responded to such a question.

There are issues on so many levels.

No company would want their employees running off the mouth when it comes to future products and designs that are in development. Surely you can piece together at least one reason why.

And then, to top it off, it's on the internet.

Have you heard of JFAMD?
 

marees

Golden Member
Apr 28, 2024
1,470
2,057
96
Not sure Intel will do discrete gaming cards in the future, but they definitely need AI cards (preferably built in their own foundries) to reclaim lost glory.