News Intel GPUs - Intel launches A580


PingSpike

Lifer
Feb 25, 2004
21,729
559
126

This rumor is interesting, but seems unlikely. I don't think AdoredTV is reliable, and why would a new customer use 7nm and not 6nm in 2022? Thoughts?

I don't believe it. Intel will scrap their discrete GPUs entirely before they outsource production. And I don't think TSMC would even give them a good deal anyway, since they're half a competitor. The Samsung rumor like this one from a while back was already unbelievable, and this is even worse.

The whole thing is pretty funny though. Wasn't Intel trying to open up their own fabs to other companies recently?
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,784
136
This rumor is interesting, but seems unlikely. I don't think AdoredTV is reliable, and why would a new customer use 7nm and not 6nm in 2022? Thoughts?

7nm wafers might be cheaper/more plentiful. Otherwise, it wouldn't make much sense. 6nm is a bit of a dead-end though (as is TSMC 7nm, really). It would make more sense to use 7nm+.

Wasn't Intel trying to open up their own fabs to other companies recently?

Yes. I think Rockchip took them up on it for a short while. Or was it someone else? Ehhh I don't remember anymore. Anyway, not many would want to use Intel's fabs at this point. Especially not with the wafer shortage.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
I don't believe it. But if it does turn out to be true, TSMC 7nm in 2022 is going to be pretty damn uncompetitive.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
Makes no sense.
What will happen to their stock market value? It will plummet.

The company with the best process and the best foundries in the world can't manufacture its own products and has to resort to a third party? That makes its entire foundry business worth nothing.

And many will start questioning whether all that huge investment in their own fabs will ever see a return.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Fanlesstech is saying we'll see Tigerlake NUCs later this year. The key point is the dGPU version is going to use 3rd party GPUs.

That significantly reduces the chance of seeing Intel dGPUs being released this year. Otherwise why wouldn't they just use their own?
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Fanlesstech is saying we'll see Tigerlake NUCs later this year. The key point is the dGPU version is going to use 3rd party GPUs.

That significantly reduces the chance of seeing Intel dGPUs being released this year. Otherwise why wouldn't they just use their own?

From my experience with the NUCs (besides that short-lived Hades Canyon thing), they're focused on being tiny and low power/heat: limited, low-clocked CPUs at best, and the current IGPs are more than enough for basically all NUC customers. Xe, I think, is going for higher targets and AI/distributed computing. Basically it fills a gap in their product stack so they don't have to sell Nvidia parts for big university/research projects (ideally, though Nvidia has a HUGE lead in this area over Intel and AMD).
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
For gamers, DG1 means Dated Graphics 1, and there will be no DG2 for us. It's sad and not what I wanted, but it's what I always expected: getting people hyped for something that will never come and will suck if it does show up, just like Vega. Having Raja's name associated with any GPU makes me expect disappointment now, after seeing him lie next to a PC with a Vega GPU inside it before release as if expecting to make love to the thing, only for it to end up being a pile of crap. No confidence. My only hope is with Jacket Man and can only be with Jacket Man, regardless of the hilarious prices. At least that man delivers.
 

Hitman928

Diamond Member
Apr 15, 2012
5,177
7,628
136
This is interesting...
Looks like Xe does geometry well.

I mean, "well" is relative. Yeah, it performs better against a 15 W APU, but can you really call a brand-new discrete graphics card performing well when it's only a bit better than a 15 W APU? It's obviously still too early for any conclusions, and we don't know what power it was actually running at, but compared to the supposed Tigerlake sample in the same comparison, it's drawing significantly more than 15 W. Like I said though, still a bit early.
 

Tup3x

Senior member
Dec 31, 2016
944
925
136
I mean, "well" is relative. Yeah, it performs better against a 15 W APU, but can you really call a brand-new discrete graphics card performing well when it's only a bit better than a 15 W APU? It's obviously still too early for any conclusions, and we don't know what power it was actually running at, but compared to the supposed Tigerlake sample in the same comparison, it's drawing significantly more than 15 W. Like I said though, still a bit early.
It has the same config as the integrated form. Obviously we don't know if it's faster than that or meant to emulate it (we don't know the memory config and power limits). Still, it doesn't look too bad considering that the drivers are far from optimised and likely just for debugging purposes at this point.
 

Hitman928

Diamond Member
Apr 15, 2012
5,177
7,628
136
It has the same config as the integrated form. Obviously we don't know if it's faster than that or meant to emulate it (we don't know the memory config and power limits). Still, it doesn't look too bad considering that the drivers are far from optimised and likely just for debugging purposes at this point.

The DG1 sample is performing 59% faster than the Tigerlake sample, which is sitting in a 15 W APU. Clearly it's being clocked significantly higher and probably using faster memory, and both of those will push the power use much, much higher than what the APU is using at 15 W. If you look at DG1's best graphics sub-test, it has a 42.3% lead over the Ryzen 4800U, but then if you look at the Tigerlake APU, the Ryzen 4800U is 49.4% faster than it, and that's still Vega in the Ryzen APU, not even Navi yet.

So how much power is the DG1 sample using? Obviously we don't know, but it's safe to say that it is nowhere near 15 W. That's why I said performing well is relative once you realize the DG1 sample is drawing so much more power than the 4800U it's being compared against. When you look at an APU in the same power envelope, it gets crushed by the 4800U.
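A quick back-of-the-envelope check of those ratios (assuming the 42.3% and 49.4% figures come from the same graphics sub-test, with everything normalised to the Tigerlake sample = 1.0):

# Rough sanity check of the relative standings quoted above.
# Assumption: the 42.3% and 49.4% leads refer to the same graphics sub-test.
tigerlake = 1.00                 # Tigerlake APU sample, normalised
r4800u = tigerlake * 1.494       # 4800U is 49.4% faster than the Tigerlake APU
dg1_best = r4800u * 1.423        # DG1's best sub-test leads the 4800U by 42.3%

print(f"4800U vs Tigerlake: {r4800u:.2f}x")    # ~1.49x
print(f"DG1  vs Tigerlake: {dg1_best:.2f}x")   # ~2.13x in its best sub-test
# DG1 still beats a 15 W APU by 42% in that test, which is hard to square with
# anything close to a 15 W power budget for the card itself.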

I agree it's too early to make any conclusions, which I already mentioned, and that significant gains could be had with further BIOS tweaks and driver development, but for now we can only react to what we have, and these early results aren't looking good for DG1, even if the underlying architecture is a step forward for Intel. Hopefully for their GPU team, the driver state really is incredibly crude at this point and they're leaving most of the hardware's potential untapped.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
I predict Intel GPUs will fail miserably.
I doubt they will fail miserably. They are going to gain a substantial amount of market share pretty quickly, IMO. I'm guessing Dell, HP, Cyberpower, etc. are all foaming at the mouth waiting for Intel deals that will allow them to put a 1080p60 card into their prebuilts, because they're assuming (probably rightly) that they will pay double-digit percentages less than they would for a 1650 or 5500. It's going to take market share away from Nvidia and AMD, for the best reason: cheaper competition.

I couldn't care less whether the Xe at 1080p60 is actually better than a 1650 or 5500XT. What I care about is whether it will force Nvidia and AMD to alter their pricing. If it does, I think we all win.
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
I couldn't care less whether the Xe at 1080p60 is actually better than a 1650 or 5500XT. What I care about is whether it will force Nvidia and AMD to alter their pricing. If it does, I think we all win.

Well, if Intel only competes in the lowest performance bracket, then NV and AMD only need to adjust pricing in that segment and can keep their duopoly going for anything better performing.
 

senseamp

Lifer
Feb 5, 2006
35,783
6,187
126
Intel investors expect 60% gross margins, so don't expect pricing miracles, especially with their high-end CPU cash cows under attack from AMD.
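As a rough, made-up illustration of what that margin target means (the $120 manufacturing cost below is an invented placeholder, not a real figure):

cost = 120.0                         # hypothetical manufacturing cost, not a real figure
target_margin = 0.60                 # gross margin = (price - cost) / price
price = cost / (1 - target_margin)   # minimum price that still hits the target
print(f"Minimum selling price: ${price:.0f}")   # -> $300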
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Intel investors expect 60% gross margins
Funny that the efficiencies of the market are coming back to haunt companies. Watching corporations churn and burn CEOs whose compensation is tied to stock price is part-hilarious and part-scary.

60% gross margins aren't sustainable in a competitive marketplace.