
Do you think Intel could destroy AMD in making video cards?

Disclaimer: I work at Intel in the graphics team, but my opinions are my own...

@ocre

You are correct that Larrabee was overambitious: for it to have succeeded, it would have needed top-notch perf and perf/W running DX and OGL at prices competitive with AMD/NVidia cards, AND then offered the fully programmable rendering pipeline as a bonus. That was an unachievable dream.

However, your assertion that Intel's progress in IGP is due to an infusion of NVidia IP is incorrect. Intel's IGP is 100% internally developed (I am talking about the "Gen" architecture used in Sandybridge, Ivybridge, Haswell, and Baytrail, not the phone chips like Merrifield that use Imagination PowerVR).

Yes, Intel is making payments to NVidia, but that is basically "protection" money - Intel develops its own GPUs and, as you say, it may independently reinvent something that NVidia already has patents for. To avoid the risk of the $50B PC business being held hostage to NVidia patent-infringement lawsuits, Intel pays money to NVidia.

All of the improvement you have seen in Intel graphics in the last several years has been because Intel management continues to take graphics, media, and parallel computing very seriously and has made sizeable investments in headcount/R&D on HW and SW as well as the die area allocated to graphics.

While I can't disclose Intel roadmaps, you can expect to see sizable jumps in graphics performance at each power envelope in coming years.

<SNIP>


I am absolutely tickled that my post dragged you out of the woodwork. I appreciate your time and love to see posts from those inside the companies.

I never meant to imply that Intel infused NVidia designs into their "Gen" IGP, and I completely support your statement that the Intel IGP was built and designed totally in house. I do have to wonder about the deal being just "protection" money, as that goes against my understanding of the matter. So I went looking for information on this, which is very limited because nothing was released to the public. But I did find some comments that completely contradict your "protection" claims.

http://arstechnica.com/business/201...l-look-for-nvidia-gpu-on-intel-processor-die/
An NVIDIA spokesperson said, "Licensing a technology is different than incorporating an entire processor. The settlement provides Intel with access to our IP and patents, such as Sandy Bridge which already uses NVIDIA technology. The license enables Intel to extend that model for the next 6 years."
This doesn't mean it is not an Intel design. They would have to implement it their own way, in a vastly different environment. But I think the "protection money" framing goes too far in trying to wipe out the notion; in my understanding, you should at least be able to say the newest Intel IGP was influenced by this technology you seem to shrug off.

PCIe access is another very important aspect that NVidia fought for, but if that were all the deal was about, wouldn't you expect NVidia to be paying Intel? That part is really a benefit only to NVidia, yet we see a massive payment from Intel. So if Intel really doesn't have any NVidia-influenced technology in their IGP, what do you suppose will happen after this deal is over? Intel would have no real reason to sign another, would they?

I am willing to wager that Intel will absolutely work out another cross-licensing deal once this one is over.

NVidia did need assurance that Intel wouldn't cut off PCIe. NVidia was really, really fearful of that. But there is more in the agreement than that.

I don't mean to be confrontational. I know the choice of wording is difficult and easily misconstrued. Semantics, perhaps.

I am short on time but am really glad that you took the time to post here. Please encourage some of your coworkers to spend time on forums talking with some of their biggest fans.
If you don't mind, what role do you play in the graphics team?

Best wishes
 
@ocre:

I thought he already said earlier that they are infringing on patents, even though they developed everything in house.

Intel develops its own GPUs and, as you say, it may independently reinvent something that NVidia already has patents for.
 
I did not say that Intel was infringing... please don't put words in my mouth.

Nvidia PR and truth are only occasionally and accidentally aligned...

I said that signing a licensing agreement was a way to foreclose the sort of legal situation that NVidia is now pursuing against Samsung... whether there is infringement or not (and whether the patents in question are even valid, non-obvious, etc.) is almost beside the point. The threat that some judge or jury could issue an injunction blocking all imports is a sort of "nuclear assured destruction" threat. Do you want to go through years of lawsuits and countersuits?

Or do you have both companies put their piles of patents on the table, weigh them; factor in who has the most to lose, who has deep pockets, and the prevailing legal winds (there were several major lawsuits in flight at the time and some were trending against Intel)... and then decide who should pay whom how much and what kinds of other terms you need to safeguard your interests and settle?

I am not a lawyer, but that sort of negotiation seems to be what happens in a lot of these licensing/patent battles. I claim no special insight on this one...
 
According to some here, AMD would be doing extremely well.

I'm not sure of that.

On the one hand, you could say that if they had spent that $5 billion or whatever the amount was on R&D instead of on buying ATI, they might have been able to produce something to compete with Conroe at the time. Or they could have failed and wouldn't have been able to survive as long as they have with ATI.

We will never know now, but it's always something interesting to think about.

My gut tells me AMD still wouldn't have been able to prevent the steamroll that ran from Conroe to Nehalem to Sandy Bridge to present-day Haswell.

And it saddens me, because the Athlon is now a shell of its former self.

As for GPUs, it's not really worth it for Intel currently. It would take too much money to try to catch up with AMD and NVidia, and it wouldn't meet the profit margins that Intel requires.
 
I did not say that Intel was infringing... please don't put words in my mouth.

Nvidia PR and truth are only occasionally and accidentally aligned...

I said that signing a licensing agreement was a way to foreclose the sort of legal situation that NVidia is now pursuing against Samsung... whether there is infringement or not (and whether the patents in question are even valid, non-obvious, etc.) is almost beside the point. The threat that some judge or jury could issue an injunction blocking all imports is a sort of "nuclear assured destruction" threat. Do you want to go through years of lawsuits and countersuits?

Or do you have both companies put their piles of patents on the table, weigh them; factor in who has the most to lose, who has deep pockets, and the prevailing legal winds (there were several major lawsuits in flight at the time and some were trending against Intel)... and then decide who should pay whom how much and what kinds of other terms you need to safeguard your interests and settle?

I am not a lawyer, but that sort of negotiation seems to be what happens in a lot of these licensing/patent battles. I claim no special insight on this one...

So the whole thing is just in case NVidia might decide to sue us like they did Samsung?

That's just funny.

NVidia was suing Intel, but that was over the loss of their chipset business and Intel trying to knock them out of their core market. Don't you think NVidia would have been suing them for infringement back then? Why would Intel just start paying NVidia billions of dollars when they could have waited and settled the dispute if and when NVidia went after them for it? It really makes no sense for Intel to pay this large sum "just in case" NVidia might one day decide to go after them for infringement. The only way it makes sense is if they already knew their design was infringing in the first place.

Really, this stance you are taking seems strange to me.
So what about the technology deals Intel had with ATI, which all ended in 2006, forcing Intel to SiS? Was that all in good measure? Just in case?

Not so long after SiS, Intel is in a cross-licensing deal with NVidia... and it's out of the blue? Just in case?

Yeah, NVidia makes no denial of using Intel patents. This deal covers a very broad stack and allows Intel access to NVIDIA's full range of patents. NVidia also gets to use Intel patents (which, with Tegra, is great for them). I know you're trying really, really hard to minimize this to almost nothing.
Perhaps you will keep insisting; Intel has been very, very quiet about the details of the deal. That is their right, of course. But I simply reject the level you're reducing this to. Intel got access to a full range of NVidia patents, and their IGP started taking off. I mean, before then they weren't even up to date on DX. Surely Intel wasn't/isn't using any of those patents. It was all just in case.
 
@ocre -

I have been working in the graphics team since 2006; Sandybridge was already in definition then and was part of the "10x by 2010" goal. So the ramp of Intel graphics started well before the settlement.

I am on the software team and have worked daily for the last ~4 years with the hardware architects who are defining future products. We are working now on designs for 2017+. I can tell you that I have never seen any document or been in any conversation where someone said "and here we can use this handy patent or IP block from NVidia." Everything is developed in house.

You can believe me or not as you like; I am telling you the truth.

I don't know about stuff earlier than 2006, so I don't know exactly what you mean about ATI and SiS. As I recall, there had been some issue in the early-to-mid 2000s where Intel had CPUs ready but couldn't manufacture enough chipsets, and there was a scramble to find other chipset suppliers. And of course, after ATI was bought by AMD, everyone saw that it would be odd to expect them to be an ongoing chipset supplier for Intel CPUs.

Re: NVidia and chipsets - I think they weren't happy, but, as I said, once Moore's law drove integration of the MCH and GPU into the CPU complex, having graphics from NVidia in a PCH was never going to make sense technically. Are you saying NVidia would have been content to simply provide storage/networking/etc. PCH functionality?

And going forward, as features like manageability and content protection straddle the CPU and chipset, and ultimately get integrated into SoCs like Baytrail, having separate chipset suppliers stops making sense.

Given that, it is pretty clear that any legal outcome forcing Intel to allow NVidia chipsets was not really a viable roadmap for either company.

The settlement makes this point clear - Intel has to offer PCIe on form factors where a discrete GPU is a viable option. This ensures Intel doesn't cut NVidia out of the GPU business on Intel platforms completely. But note the very reasonable exclusion for tablets, phones, etc.

Why settle, then? Well, remember that NVidia was piling on top of various antitrust claims made by the NY attorney general et al. There was enough "adverse momentum" from emails about exclusivity, etc., that Intel had to take a defensive and conciliatory stance.

Again, I am not a lawyer, but you can imagine NVidia saying "we have a zillion patents on graphics; you are putting graphics in your CPUs and we can't sell our chipsets anymore... I bet we could find some patent to sue you over... and if you lose, you won't be able to import/sell your CPUs in the US... Now how about you settle and pay us roughly what we were making per year in chipset revenue, and we'll call it even." And the Intel lawyer says "OK, but I get to take the patent club out of your hand so you can't repeat the threat again next week to seek a higher payout - you will license me all your patents so I can't be accused of infringement."


I don't know if that is exactly how it played out, but it doesn't seem far-fetched to me.
 
I have been working in the graphics team since 2006; Sandybridge was already in definition then and was part of the "10x by 2010" goal. So the ramp of Intel graphics started well before the settlement.

I am on the software team and have worked daily for the last ~4 years with the hardware architects who are defining future products. We are working now on designs for 2017+. I can tell you that I have never seen any document or been in any conversation where someone said "and here we can use this handy patent or IP block from NVidia." Everything is developed in house.

Very interesting to hear those comments. It must be exciting to work on designs meant for a 7nm node. I hope you keep them coming.
 
So the whole thing is just in case NVidia might decide to sue us like they did Samsung?

That's just funny.

Multi-billion-dollar payouts from lawsuits are hardly funny. They are serious business; quite literally.

And for the record, yes, cross-licensing agreements are nothing more than litigation cease-fires. They're very common in the industry. Nobody is actually using each other's IP (in the scope of this subject).
 
Thanks a lot, rootheday3.

I did enjoy your insight and your time. I think many people would enjoy it if you stuck around and joined in as much as possible. You can bring a unique perspective that many would love to read.

As for the details of the agreement, I can only speculate. And as much as you are closer to the matter, you speculate too...


I have heard that the deal made clear NVidia wouldn't pursue x86 CPUs. Because of this provision, I thought it was only natural that there would also be a counter to it, and Intel would agree to something similar: no dGPU for gaming, at least until the deal was up.

So this is the main thing I was trying to contribute to the thread. It is only speculation, and perhaps you know something about this. Or maybe not.
 
Multi-billion-dollar payouts from lawsuits are hardly funny. They are serious business; quite literally.

And for the record, yes, cross-licensing agreements are nothing more than litigation cease-fires. They're very common in the industry. Nobody is actually using each other's IP (in the scope of this subject).

Oftentimes they settle such lawsuits when they are brought against them. You know, deal with it when it happens.

But here is what doesn't make any sense at all about rootheday3's claims:

Intel had been making IGPs for many years before the Sandy Bridge IGP - a very long time - but then all of a sudden, and without a pending lawsuit against them, they start paying NVidia a fat sum for cross-licensing. Oh, and their IGP suddenly becomes significantly more relevant. Even DX compliant.

Another issue is that Larrabee was intended for Haswell, and I think even Sandy Bridge in earlier roadmaps. Where, why, and how did this new "Gen" IGP pop up so fast?
 
During the Sandybridge time frame we started investing a lot more in software and hardware; when I started we had about 300 people working in graphics software vs. about 1500 now... including more product lines (Atom, Core); more SKUs (GT1, GT2, GT3, GT3e); Windows and Android plus Linux media server, virtualization, and entry workstation; way more APIs, performance, and app compatibility; and even leadership features like QuickSync, PixelSync, and DX12 joint engineering with Microsoft.

I personally led an effort in 2010 to address DirectX game compatibility and worked with SSG app engineers to ramp up our ISV engagement with game developers.

Haswell was supposed to have integrated Larrabee; Ivy Bridge (Gen7) was supposed to be the last evolution of the Gen line. Haswell's switch to Gen7.5 was a last-minute thing and didn't get many new features - DX11.1, scale-up to GT3 and GT3e, connected standby for Ultrabook SKUs, higher display resolutions including 4K... plus some fixes for the sampler and non-promoted Z, because that's all there was time for.

This meant a lot of scope to catch up on for Gen8 as we got the Gen roadmap revved back up - OpenCL 2.0, shared virtual memory, codecs, performance enhancements. Gen9 adds a bunch more... which I can't talk about.
 
So Intel had really put a massive effort into the Gen IGP even while dumping so much into Larrabee. Doubled down. Got much more serious about SW. How about the driver - did Intel start doing it all in house by Sandy Bridge?
 
The Gen Windows driver has been almost completely in house all along - with some minor outsourcing to accelerate OpenGL feature implementation from time to time. The Android Gen OGL, media, and RenderScript drivers (e.g. for Baytrail tablets) are from the same team.

There is a separate in-house team in SSG that does open source "clean sheet" user-mode OGL and media drivers for Linux (but both teams collaborate on Linux kernel work shared between Android, Chrome, and Linux).

In addition to the app engineers who work with ISVs, SSG also has teams that develop tools like iGPA (a perf analyzer) and do rendering research.

There is still a Larrabee SW team working on the Xeon Phi stack and tools.

Lastly, some Intel SoCs use 3rd-party graphics IP (like Imagination PowerVR in Clovertrail, Moorfield, Merrifield). For these, Intel gets a reference driver from the IP provider, so you could say the work is split between the in-house team and the IP provider. Typically the in-house team has to do some work to integrate it with the rest of the SoC's features in areas like display, DVFS, power-well management, sleep/standby, Miracast, etc. In some cases the reference drivers don't work well at all and need major fixes or tuning. This was especially true for Clovertrail on Windows, where PowerVR had no previous implementation and the reference driver was nowhere close to passing WHQL.
 
DX12 joint engineering with Microsoft.
FYI, some people say that DX12 is MS's response to Mantle.

Haswell was supposed to have integrated Larrabee; Ivy Bridge (Gen7) was supposed to be the last evolution of the Gen line.
Will the Gen line continue, and has the Larrabee thing been discarded? Or will we see some nice things with Cannonlake's Gen10?

Gen9 adds a bunch more... which I can't talk about.
Great post 🙂.
 
I don't know exactly when Microsoft started on DX12; I know that we (Intel) had drivers ready to go for GDC in March 2014 (the 3DMark threading demo ran on a DX12 driver on Haswell). We have been hard at work with Microsoft before and after that, and with ISVs, hammering out DX12 spec details and getting drivers ready for Haswell-and-beyond platforms to support upcoming OS milestones and ISV enabling.

Whether DX12 is a response to Mantle doesn't ultimately matter to me - what matters is that DX12 will be an industry-wide standard, not a spec that belongs to one IHV.
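To make the cross-vendor point concrete, here is a minimal sketch (C++, assuming the Windows 10 SDK; illustrative only, not code from any shipping driver): the same DX12 initialization path enumerates adapters and creates a device regardless of which IHV made the GPU.

```cpp
// Minimal DX12 bring-up sketch: no vendor-specific branches anywhere.
#include <cstdio>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    // Walk every adapter in the system; Intel, AMD, and NVidia GPUs all take
    // the same path, because the API is the contract, not the IHV.
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"DX12-capable adapter: %s\n", desc.Description);
        }
    }
    return 0;
}
```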

The Gen line will continue and there is plenty of goodness to come beyond Gen9; we have been working on Gen10 for nearly a year, and Gen11 is in definition... and I can't say anything about those either 🙂
 
Whether DX12 is a response to Mantle doesn't ultimately matter to me - what matters is that DX12 will be an industry-wide standard, not a spec that belongs to one IHV.
DirectX belongs to a single vendor. Mantle is going to be released as an open beta, so its usage potential will be the same as DX. Also, Mantle has the potential to work with other OSes; this is not the case with DX, which is tied forever to a Microsoft OS.
 
DirectX belongs to a single vendor. Mantle is going to be released as an open beta, so its usage potential will be the same as DX. Also, Mantle has the potential to work with other OSes; this is not the case with DX, which is tied forever to a Microsoft OS.

Is Mantle open right now?
 
APUs are the future. Once they can run games at 1080p on medium settings, they will be mainstream for sure = market share.

Most APUs can run games at low settings in 1080p now, right?

Naw, the bar keeps moving. By the time APUs get pretty good at 1080p, the mainstream will be split between 1440p and 2160p. RussianSensation has the right of it: Intel could dominate this niche if they so chose, but the cost/benefit analysis just looks brighter in other markets atm. If Intel ever does get really serious about high-end GPUs, then they'll buy ATI from AMD, or they'll figure out a way to convince JHH to sell NV to them at a reasonable price (i.e., one that doesn't involve him running INTC).
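For a sense of how much that bar moves, a quick back-of-envelope sketch (just raw pixel counts; real game workloads scale with more than resolution):

```cpp
// Pixels per frame at common resolutions, relative to 1080p.
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; };
    const Res res[] = {
        {"1080p", 1920, 1080},
        {"1440p", 2560, 1440},
        {"2160p", 3840, 2160},
    };
    const double base = 1920.0 * 1080.0;
    for (const Res& r : res) {
        const double px = double(r.w) * r.h;
        // 1440p is ~1.8x the pixels of 1080p; 2160p is exactly 4x.
        printf("%s: %4.1f Mpix (%.1fx vs 1080p)\n",
               r.name, px / 1e6, px / base);
    }
    return 0;
}
```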
 
No. There will be a public beta released soon (ish?).

So no release date on Mantle? Mantle being open is relatively useless if DX12 hits first or soon after.

Actually, it's relatively useless, period, because NVidia WON'T adopt the standard. When DX12 hits, why develop for Mantle + DX12 when DX12 does the same thing and can be used with both vendors?

Mantle had a chance, but IMO it needed to move far quicker than it did to be a viable solution.

I really don't see it surviving past DX12's release date.
 
Actually, it's relatively useless, period, because NVidia WON'T adopt the standard. When DX12 hits, why develop for Mantle + DX12 when DX12 does the same thing and can be used with both vendors?
There is quite an impressive list of game devs who don't agree that Mantle is useless. Also, you're assuming two things: one, that DX12 will be as flexible as Mantle, and two, that DX12 will improve at the same rate as Mantle. Given Microsoft's glacial pace, I'd say Mantle has a really good chance of improving at a much faster rate vs. DX.
I really don't see it surviving past DX12's release date.
Game devs DO see it surviving just fine, or they would not waste any time with it.
 
There is quite an impressive list of game devs who don't agree that Mantle is useless. Also, you're assuming two things: one, that DX12 will be as flexible as Mantle, and two, that DX12 will improve at the same rate as Mantle. Given Microsoft's glacial pace, I'd say Mantle has a really good chance of improving at a much faster rate vs. DX.

I mean, I understand that, but Mantle will still be AMD-only. NVidia won't adopt it. We both know this will be true. So DX12 can bring even 95% of the performance of Mantle and be fine. Even 90%. It'll still be the thing MOST devs use, and Mantle will be more of a niche.

Mantle needs to either A) improve faster (being an open standard, it may be able to do so) or B) get a higher adoption rate in games.

I don't see Mantle getting into more games than GameWorks (ugh), and I don't see Mantle being such a huge performance benefit over DX12 that NVidia would insist on adopting it.

But that's just my opinion.
 
Maybe if NVidia and AMD start to fail because Intel has taken away 80% of their markets, then Intel will capitalize on that for an easy win with a single high-end discrete project (targeted more at mobile but also available for desktop).

Huh? That doesn't even make sense. Are you saying that Intel is so vast/dominant/scary/etc. that they can design one chip to be the best from mobile all the way up to 4K? They can't even beat lower-mid-range offerings from NV and AMD right now.

They already tried and failed. That experiment, though, led to them creating better internal GPUs.

I think the entry would be too costly for the profit available. If the annual profit pie is, say, $2 billion, right now it's divided between AMD and NVidia. If Intel came in, it would be divided three ways: NVidia would likely still have the highest share, then AMD, and Intel would likely hold a small 5-10% share of the market.

Not worth enough.
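Putting rough numbers on that (a sketch only; the $2B pie and the 5-10% share are the assumptions above, not real market data):

```cpp
// Back-of-envelope: Intel's slice of an assumed discrete-GPU profit pool.
#include <cstdio>

int main() {
    const double pie_usd  = 2.0e9;  // assumed annual profit pool (see above)
    const double share_lo = 0.05;   // assumed Intel share, low end
    const double share_hi = 0.10;   // assumed Intel share, high end
    printf("Intel's slice: $%.0fM to $%.0fM per year\n",
           pie_usd * share_lo / 1e6, pie_usd * share_hi / 1e6);
    return 0;
}
```

That works out to roughly $100M-$200M a year: small change against the entry cost.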

Depending upon whom you ask, Intel was pretty close to buying NV a few years ago. But JHH didn't want to sell, or at least he didn't want to retire to his own private island, so he demanded to be CEO of the "combined" company.

It would have been cheaper and likely smarter for Intel to just buy AMD instead of wasting all that time and money trying to reinvent the wheel. Or, in this case, reinvent the GPU, haha.

I'm pretty shocked by the level of knowledge and quality of information on this forum compared to others... *cough*Tom's*cough*.

Tom's was good a long time ago, but he took the money and ran, so to speak, instead of building his business the way Anand has over the years.

Edit: ha, I've been away for a while. Did Anand sell the company when he went to work for Apple, or did he just step down as CEO but retain his ownership?
 