News Intel GPUs - Intel launches A580


Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
All but dead, but it produces $419 million in revenue, which is probably more than AMD's entire graphics division revenue.

http://files.shareholder.com/downlo...9BC-4CBF-8987-4679CF3FAA71/CFO_Commentary.pdf

Tegra is one of their fastest growing businesses at the moment.

They have of course mostly pushed it sideways into Automotive stuff:
https://www.anandtech.com/show/10596/hot-chips-2016-nvidia-discloses-tegra-parker-details
https://www.anandtech.com/show/1187...orrt-3-announcement-at-gtc-china-2017-keynote

Although the higher target TDPs in cars should logically mean they're quite well suited for use in games consoles and the like too.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
Just for fun.. :D
I had to do it

[image: e6EPtkO.png]
 

turtile

Senior member
Aug 19, 2014
614
294
136
Well, it would be good if Intel could get in the game with a competitive card.

The competition is needed, imo.

I think we will see their cards much sooner than 4 or 5 years from now, though.

I doubt we will see Intel focused on gaming. They will likely focus on the server market only.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
All but dead, but it produces $419 million in revenue, which is probably more than AMD's entire graphics division revenue.

http://files.shareholder.com/downlo...9BC-4CBF-8987-4679CF3FAA71/CFO_Commentary.pdf

Tegra is one of their fastest growing businesses at the moment.

Tegra was Nvidia's attempt to get into mobile. It failed completely. I thought you were talking about Chromebooks and similar?

The Tegra business is essentially all Switch and Nvidia Shield, using a chip from 2015. It grew because the Switch was a hit, and it didn't exist until recently. That, and a bit of automotive work.

But Tegra failed in its attempt to get into phones/tablets.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
Actually it leaves NVidia without a dance partner in laptop GPUs, and the increasing trend is that people are mostly buying laptops.

Both Intel and AMD will have better and better "good enough" laptop GPUs integrated on the CPU die. Nvidia is locked out at this level.

Both AMD and Intel will also have dGPUs that will be put in special packages, tightly integrated with something like EMIB. Nvidia will have a hard time integrating its dGPUs this way, so Nvidia will mostly be locked out at this level of integration.

Intel may have packaged up an EMIB commercial proof of concept using a Radeon dGPU chip, but you can bet that future integrated designs from Intel will be all-Intel once they get their dGPU ready, and AMD will be packaging all-AMD solutions.

Where does that leave NVidia? Mostly on the outside without a CPU partner for laptops.

Nvidia dGPUs will be pushed into the niche of hardcore gamer laptops that run a GTX 1070 or faster.

So the real loser here is NVidia, not AMD.


How much longer does Intel contractually have to support PCI-E? Once they stop supporting that as an external GPU interface, and switch entirely to EMIB-packaged on-package GPUs, NVidia is toast.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Maybe, with Windows soon to be running on ARM chips/Nvidia Tegra in laptops, Intel/AMD will need something to counter it? Just guessing.

I mean, a new super-fast 7-watt Tegra chip in a laptop running Windows 10 sounds good to me.


Edit: I also agree with the mobile chip theory. Nvidia has a massive lead, and Intel/AMD want a piece of that pie too.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
The only lead that Nvidia has in mobile is their dGPU. They have compelling products, but few new wins. Switch is a big deal, but it's a closed ecosystem. They need to push more on Chromebooks (at the expense of Rockchip) and get firmly in bed with MS and their next attempted ARM port of Windows.
 
Mar 11, 2004
23,031
5,495
146
Sorry if this was actually stated/explained, but are Intel and AMD possibly co-developing GPUs? It seems really bizarre that they announce the partnership on the one chip (along with a licensing deal? the way it was touted, it sounded more like a licensing arrangement than a single specific product), and then Intel hires the head of AMD's graphics group (with seemingly no non-compete clause?). It seems to me that licensing RTG GPUs and then doing a co-development deal would serve them both well. It lets Intel get up to speed on GPUs faster, and it gives AMD more market share (helping to speed software development). Most of all, it lets them both take on Nvidia, with both offering compatibility for HPC/compute/etc., where going with an Intel or AMD CPU gets you tight integration with the GPU. Intel would still have advantages (Optane, for instance). AMD keeps tight integration on APUs; Intel becomes less reliant on Nvidia. Meanwhile, each can tweak GPUs as they see fit going forward (maybe Intel focuses more on the pro stuff, or they each pick particular niches) while still keeping some compatibility. Short term it helps chip away at Nvidia's market share (both in consumer for AMD GPUs, but also in pro, where Intel and AMD could both use leverage against Nvidia; potentially in mobile too, where I could see Intel pairing an updated Atom, or maybe just ARM, or a bit of each, with a mobile-focused GPU based on RTG and an integrated Intel modem).

Oh, and Intel gets to manufacture GPUs on their process, potentially giving them an advantage (maybe they get higher clocks or certain other features, maybe some certified external GPU setups for laptops), while AMD gets flexibility (possibly with Intel even as a foundry partner?).

To me, it would seem to make a lot of shorter-term sense: they're competing but not stepping on each other too much, as they're both fairly entrenched in certain markets/segments, and then they can see how things go. But I'm not sure there's much actually supporting any of that.

Maybe, with Windows soon to be running on ARM chips/Nvidia Tegra in laptops, Intel/AMD will need something to counter it? Just guessing.

I mean, a new super-fast 7-watt Tegra chip in a laptop running Windows 10 sounds good to me.


Edit: I also agree with the mobile chip theory. Nvidia has a massive lead, and Intel/AMD want a piece of that pie too.

Is that possible at this point? What I mean is that Microsoft is moving to that, but I thought they said it requires extensive bridging of code or special compiling, so they are basically focusing on a specific chip (meaning it won't open things up to all ARM designs, even when they follow the ARM instruction set), which I believe is the Snapdragon (835?) to start with. Sure, they can probably expand, but I wouldn't say there's any guarantee that Microsoft will put in the work, and I think Intel already threw a fit about Microsoft and Qualcomm doing even that (which is, I think, why things have grown pretty quiet on that front), so I doubt they'd be pleased with Nvidia adding to it.

But it's been a while since I've seen anything on that, so maybe things have changed.

I'm not sure about a massive lead. Intel is making big progress on modems, which alone were enough to give Qualcomm a near monopoly in the more premium segment (even when Apple started doing their own SoCs, they still used Qualcomm modems). And Intel is making other changes (part of their big new GPU push is for IoT and mobile). Plus, Nvidia's custom designs have had serious issues, and they haven't updated their standard ARM-based ones in, what, 2-3 years? Either they've got problems with their custom CPUs again, or they can't make a business case for them beyond specific niches, which suggests the issues are much bigger than just getting a chip made.

The only lead that Nvidia has in mobile is their dGPU. They have compelling products, but few new wins. Switch is a big deal, but it's a closed ecosystem. They need to push more on Chromebooks (at the expense of Rockchip) and get firmly in bed with MS and their next attempted ARM port of Windows.

Nvidia has also rubbed a lot of companies the wrong way. Nintendo was about the only major tech/gadget company that had not worked with (and subsequently had a sour experience with) Nvidia. That seems to be a good fit now, but we'll see, as both are notoriously guarded about licensing (read: $$$) issues, which is what usually soured companies on working with Nvidia (who wanted premium pricing for what they claimed was premium hardware, which early Tegras really did not live up to). I think the Nintendo deal was actually more or less a matter of convenience: Nvidia was the only company with an ARM design with a particularly strong GPU that could be bought off the shelf, and they also had tools ready to help developers (a notorious Nintendo weak spot, one which definitely hampered the Wii U). AMD had stalled or cancelled their ARM chip, and Jaguar, even on the latest process, likely wouldn't have been able to offer the battery life that ARM does.

Some of those companies have worked with Nvidia again, even somewhat agreeably (Microsoft most notably: after they were mad about the Xbox, they still used Tegra in the Zune HD and in the non-Pro Surfaces, although that's likely because Nvidia maintained the Windows/PC hardware relationship; it will be interesting to see how the Intel/AMD thing works out, as the Surface would seem to be a good fit for that hybrid Intel CPU/AMD GPU package, but it'll be almost another year before they update it, and by then things could change). Sony was mad after the PS3 (where they basically turned to Nvidia at the last moment, when it became clear Cell couldn't outdo a GPU for game rendering, and then felt that Nvidia wasn't willing to deal on pricing). Apple seems to have really soured on Nvidia; I think a large part was "bumpgate", which hurt Apple's reputation and cost them money. They have used Nvidia since, but don't any more, even though Nvidia has put in software work to try to win Apple back and would seem to have an advantage over AMD in laptops with their better perf/W. I seem to recall Samsung being bitter about, I think, the Tegra 2 or 3 that was in one of the Galaxy S phones. And in spite of Nvidia specifically touting design wins, few products with Tegra actually materialized.

I'm not sure what's up with Tegra at this point. Nvidia was touting custom cores (and acting like they had caught Apple), but we haven't seen those chips come out (other than maybe in some premium cars, where I'm not sure they're actually doing what Nvidia claims yet). And they've talked mostly about automotive since (where they also lost Tesla deals to Intel and AMD, so I'm not sure that's going all that well for them either). Not sure if they made progress on their LTE modem (another Tegra thing Nvidia hyped, only it seemed to be a dud). It seems like they need to focus: either roll with standard ARM CPUs to get compatibility and play up their GPU and its software support, or focus on making Tegra more premium (only it has to actually live up to it). Or they need to be willing to take lower margins in order to build some goodwill. But we saw Intel try that almost to the extreme, and it was a total failure.
 

xpea

Senior member
Feb 14, 2014
429
135
116
It is important to understand that current software is designed to these limits. But I can write stuff where PCIe 3.0 x16 won't be enough. And I'm totally sure you would like that. :)

In the future this will change, when a GPU will connect to the CPU with a ~100-150 GB/s low-latency interconnect, and with direct access to the CPU page table. This is what AMD and Intel are pursuing. And you can bet they don't care that NVIDIA is only able to follow them with ARM. :)
Just one question: have you ever said anything positive about Nvidia?
Because for all your knowledge, it becomes irrelevant when you only predict doom and failure for the leader of the GPU industry.
Besides, you can have all the bandwidth you want; if the calculation is an order of magnitude slower than the competition, it's useless. And up to now, Intel has not been able to compete with Nvidia in the HPC space, generation after generation...
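For context on the numbers being thrown around, here's a rough back-of-the-envelope sketch (the ~15.75 GB/s figure for PCIe 3.0 x16 is the usual theoretical peak after encoding overhead, and the 100-150 GB/s range is simply taken from the quoted post; the 4 GB working set is a made-up example, so treat all of it as illustrative, not measured):

```python
# Back-of-the-envelope transfer times: PCIe 3.0 x16 vs. the ~100-150 GB/s
# coherent link described in the quote. Figures are assumptions, not benchmarks.

def transfer_time_ms(buffer_gb, bandwidth_gb_s):
    """Milliseconds to move `buffer_gb` gigabytes at `bandwidth_gb_s` GB/s."""
    return buffer_gb / bandwidth_gb_s * 1000.0

PCIE3_X16_GB_S = 15.75   # 8 GT/s * 16 lanes * 128/130 encoding / 8 bits per byte

buffer_gb = 4.0          # hypothetical 4 GB working set shuffled to the GPU

for name, bw in [("PCIe 3.0 x16", PCIE3_X16_GB_S),
                 ("future coherent link (100 GB/s)", 100.0),
                 ("future coherent link (150 GB/s)", 150.0)]:
    print(f"{name:<32} {transfer_time_ms(buffer_gb, bw):7.1f} ms")
```

On those assumptions the gap is roughly an order of magnitude (~254 ms vs. ~27-40 ms for a 4 GB transfer), which is the point the quoted post is making about future coherent interconnects.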
 

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
And up to now Intel could not compete with Nvidia in the HPC space, generation after generation...
Top one supercomputer is totally using nVi-
Top two supercomputer is totally using nVi-
Well, you get the drill.
 

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
PCIe is used for a lot of things other than graphics. Stop with the FUD.

This isn't really total FUD. A big part of the reason Nvidia was even making motherboard chipsets for Intel, and the big piece Nvidia got out of the lawsuit (where they agreed to stop building chipsets), was extracting a guarantee that Intel couldn't shut them out.

SoC systems are often starved of lanes, or in some cases they don't even have any. I remember Nvidia building an add-on device, for netbooks maybe, that they were forced to make work over a x1 lane because that's all Intel provided on the platform. I can't recall what it was called, but I believe I read about it here on AT. So there were already signals back then that Intel was thinking of pulling the rug right out from under them. It would hurt Nvidia and AMD, after all.

If they weren't legally prevented from doing so, I could see Intel having starved out the lanes or thrown up a different kind of roadblock to add-on cards on their lower-end systems, particularly during the Bulldozer era, when everyone would probably have just taken it. Something simple: you install a Pentium in your motherboard, well, it has no CPU lanes, because those come off the CPU. I bet those super-expensive eDRAM iGPUs would have actually sold a lot better if there was no way to install an Nvidia GT 1030 in a laptop. "Why would anyone need PCIe lanes in a laptop?" I can hear Intel saying. "It saves power to remove that."
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Ignoring AMD, especially when they are on the rise again, is plain stupidity.

AMD has APUs, and has more engineering capability in delivering that kind of hardware (SoCs) than anyone else in the industry. Nvidia does not have x86 CPUs; Intel and AMD do.

Who said anything about ignoring AMD?

Just because Intel has a renewed focus on GPUs doesn't automatically mean that they will stop developing any other products.

And AMD having APUs and experience in SoCs really doesn't matter, as it simply isn't something that threatens Intel. The simple truth is that AMD's APUs are addressing a relatively tiny market. Think about it: the whole proposition of APUs is that you can replicate a small dedicated GPU in a smaller, more efficient package, but the part of the consumer market that even cares about having a small dedicated GPU in their laptop in the first place is tiny (there's a reason Intel is currently sitting on something like an 80%+ market share here). If AMD can actually figure out how to put some more significant GPUs in their APUs, then it might be relevant, but there is no indication that this is happening anytime soon.

The real threat when it comes to APUs and SoCs is, and has been for a long time, Apple. AMD gobbling up a few design wins in gamer laptops is nowhere near as big a threat as Apple replacing Intel in the MacBook lineup with ARM. And with Microsoft building Windows on ARM as well, Intel is probably afraid that this could be seen as a signal by the rest of the industry that ARM is good enough (even though the ARM products available to the rest of the industry are pretty far behind what Apple has to play with). This could quickly turn into a nightmare for Intel.

And Nvidia not having x86 CPUs is largely irrelevant when they are perfectly capable of showing immense growth in the datacenter segment without it.

It's also funny how Raja went from the incompetent fool running the Radeon Technologies Group to a star of GPU engineering just by joining Intel.

AMD and Nvidia have 4-5 years of breathing room. Any work that Raja does at Intel will come to fruition with a next-generation architecture at Intel. In that time span, A LOT can change for both of those companies.

Raja himself is largely irrelevant in all of this (he is only one guy, after all); what's important is what he represents, and that is a renewed focus on GPUs by Intel. As noted in the AT article, we don't know how long Intel has been at this, so it is perfectly possible that they will have a product out before that 4-5 year timeframe.
 

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
If AMD can actually figure out how to put some more significant GPUs in their APUs, then it might be relevant, but there is no indication that this is happening anytime soon.
They already do that; console semi-custom designs are merely beefy APUs.
The real threat when it comes to APUs and SoCs is, and has been for a long time, Apple. AMD gobbling up a few design wins in gamer laptops is nowhere near as big a threat as Apple replacing Intel in the MacBook lineup with ARM. And with Microsoft building Windows on ARM as well, Intel is probably afraid that this could be seen as a signal by the rest of the industry that ARM is good enough (even though the ARM products available to the rest of the industry are pretty far behind what Apple has to play with). This could quickly turn into a nightmare for Intel.
ARM is not going to do anything, even in the 15 W market.
And Nvidia not having x86 CPUs is largely irrelevant when they are perfectly capable of showing immense growth in the datacenter segment without it.
Riding the meme learning boom. It's a bubble that will eventually pop.
 

IndyColtsFan

Lifer
Sep 22, 2007
33,656
687
126
I had an i740 GPU back in the day for a second or third gaming system, and I don't recall it being terrible but it certainly was no match for competitors at the time. It seemed to me that Intel gave up on it too soon.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
I had an i740 GPU back in the day for a second or third gaming system, and I don't recall it being terrible but it certainly was no match for competitors at the time. It seemed to me that Intel gave up on it too soon.

As a discrete part, certainly. But between poor performance and the various architecture quirks, it was not successful.

But the core (in i752/754 form) was reused for the Extreme Graphics IGP. Which in turn spawned the more modern Intel IGPs. So it was quite successful in that regard.

The "Extreme" Graphics were horrible BTW... ;)
 

IndyColtsFan

Lifer
Sep 22, 2007
33,656
687
126
As a discrete part, certainly. But between poor performance and the various architecture quirks, it was not successful.

But the core (in i752/754 form) was reused for the Extreme Graphics IGP. Which in turn spawned the more modern Intel IGPs. So it was quite successful in that regard.

The "Extreme" Graphics were horrible BTW... ;)

Well, I mean, I guess that was my point - the i740 obviously wasn't successful and it seems to me they should've stayed in the game longer in the discrete market. I'm not really sure they're going to be able to compete with NVidia or even AMD at this point.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
How many other consumer applications of it need a full x16 Gen3 link? Sound is in the chipset, WiFi is getting integrated into the chipset... Leave an x4 link for storage, a couple of x1 links for miscellanea, and you're done.

USB, Ethernet, storage, and video capture, to name a few.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
USB, Ethernet, storage, and video capture, to name a few.

None of those need a full x16 (apart from some high-end video capture, which I would not count as consumer). I already mentioned storage, USB is built into the chipset, and Ethernet falls under the "miscellanea" that can be serviced with an x1 or x2 link.
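For what it's worth, a quick sanity check of that lane math (a sketch; the per-device bandwidth numbers below are rough peak figures I'm assuming for illustration, not measurements):

```python
# Which consumer devices actually need how many PCIe 3.0 lanes?
# Per-device needs are assumed approximate peaks, just for illustration.

PCIE3_PER_LANE_GB_S = 0.985  # ~8 GT/s with 128b/130b encoding, per lane

devices_gb_s = {
    "USB 3.1 Gen 2 controller (10 Gb/s)": 1.25,
    "10 GbE NIC (10 Gb/s)":               1.25,
    "High-end NVMe SSD":                  3.5,
    "Uncompressed 1080p60 capture":       0.37,
}

for dev, need in devices_gb_s.items():
    lanes = 1
    # smallest power-of-two lane count whose bandwidth covers the device's peak
    while lanes * PCIE3_PER_LANE_GB_S < need and lanes < 16:
        lanes *= 2
    print(f"{dev:<38} ~{need:4.2f} GB/s -> x{lanes}")
```

On those assumptions, nothing in that list comes close to needing an x16 link; an x4 for storage plus a couple of x1/x2 links covers the rest, which is the point above.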