News Intel GPUs - Intel launches A580


IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Intel are building their GPUs on the same TSMC production lines as AMD. These aren't going to make a difference to chip shortages.

It's not exactly like that. Some part is dynamic based on supply, but there will be a fixed part as well.

Availability will be a function of how much production they secured from TSMC versus how much the market desires the product.

If, hypothetically, Intel secures the same volume as Nvidia/AMD, I can see them having a problem selling the cards. Inevitably, as a new player, people will be more wary of them even in times of skewed supply/demand.

Looking at the lack of discussion around Intel graphics parts, even if they secure a lot less volume there might still be a good supply of the cards.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
It's not exactly like that. Some part is dynamic based on supply, but there will be a fixed part as well.

Availability will be a function of how much production they secured from TSMC versus how much the market desires the product.

If, hypothetically, Intel secures the same volume as Nvidia/AMD, I can see them having a problem selling the cards. Inevitably, as a new player, people will be more wary of them even in times of skewed supply/demand.

Looking at the lack of discussion around Intel graphics parts, even if they secure a lot less volume there might still be a good supply of the cards.

Until we learn their hashrate! Haha! That's a joke that's... not a joke ;)
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,819
7,181
136
So long as Intel remembers the old adage "There are no bad products, only bad prices", they should do just fine as an unproven new player in a deeply entrenched duopoly.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
Only slightly better than previous gen. No challenge to AMD APUs, which have stagnated for years themselves.

Only the G7, with an actual payload of EUs and a closer-to-optimal memory configuration, showed hope.

I'd say that in hands-on testing with a 3400G I was able to play many games at 1080p with low details. The testing with "max" details at 1080p didn't really let us see if they could hack it at native resolution for many desktops and laptops. That said, I still really appreciated the info.

Ha, and seeing the GTX 950 be so usable! lol, I was buying those for $60 (heck, got a PAIR for $70) about 9 to 18 months ago as "need a GPU, here's one that works" cards, and many of the builds I kicked out into the wild are able to game, I guess ;)

Too bad he didn't test with a GTX 960, right?!? :D
 

mikk

Diamond Member
May 15, 2012
4,141
2,154
136
Only slightly better than previous gen. No challenge to AMD APUs, which have stagnated for years themselves.


Over 50% in the fully GPU-limited tests is slightly better than previous gen? The iGPU is super important for system builders; there is certainly a market for cheap office and internet/media PCs, and this is all they need. A bigger GT2 would be a waste of silicon for these desktop CPUs. I mean, look at these AMD APUs with 3x more TFLOPS; any half-decent dedicated GPU will be vastly better than this. There is a reason why AMD's desktop APUs are a niche.
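For what it's worth, that "3x more TFLOPS" figure is easy to sanity-check on paper. A rough sketch of the peak-FP32 arithmetic, assuming Rocket Lake's 32-EU Xe iGPU (8 FP32 ALUs per EU) against a 512-shader Vega 8 APU; the ~1.3 GHz and ~2.0 GHz clocks are ballpark assumptions, not exact specs:

```python
# Peak FP32 throughput: shader count * 2 ops per clock (FMA) * clock in GHz -> GFLOPS.
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz

# Rocket Lake Xe iGPU: 32 EUs * 8 FP32 ALUs per EU, ~1.3 GHz (approximate boost clock).
xe_igpu = peak_gflops(32 * 8, 1.3)   # ~666 GFLOPS

# Vega 8 APU: 8 CUs * 64 shaders, ~2.0 GHz (approximate boost clock).
vega8 = peak_gflops(8 * 64, 2.0)     # ~2048 GFLOPS

print(f"Xe 32 EU: {xe_igpu / 1000:.2f} TFLOPS")
print(f"Vega 8  : {vega8 / 1000:.2f} TFLOPS ({vega8 / xe_igpu:.1f}x)")
```

Peak numbers only say so much about gaming performance, but they do line up with the roughly-3x gap being argued about here.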
 

Tup3x

Senior member
Dec 31, 2016
965
951
136
Only slightly better than previous gen. No challenge to AMD APUs, which have stagnated for years themselves.
Mmmm....? The dedicated die area is the smallest since Sandy Bridge, yet performance is up. The desktop version is not meant for gaming at all anyway. They aren't aiming for awesome performance here, just good enough for browsing and basic usage.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
Mmmm....? The dedicated die area is the smallest since Sandy Bridge, yet performance is up. The desktop version is not meant for gaming at all anyway. They aren't aiming for awesome performance here, just good enough for browsing and basic usage.

Well… percentage-wise… I mean, isn't that only true because the cores are so much bigger? Performance per GPU transistor or something would be a better measurement. Percentage-wise it's about the same as the 10900K - if the number of cores matters, then the percentages of all the four-core parts should be scaled to per-core numbers too? I just don't think % is a great measuring stick of relevance, even if it looks nice in that table.

It’s all a hack on 14nm so whatever. It’s functional. It’s just not APU class in Rocket Lake.
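To illustrate that point with deliberately made-up numbers (these are not real die measurements): the same iGPU block comes out as a very different "percentage of the die" depending purely on how big the CPU side around it is, which is why a per-EU or per-mm² figure would say more than the share shown in that table.

```python
# Hypothetical die sizes - NOT real measurements - purely to show why "% of die"
# shifts when the CPU cores around an identical iGPU grow or shrink.
igpu_mm2 = 40.0

for name, total_die_mm2 in [("small 4-core die", 120.0), ("big 10-core die", 210.0)]:
    share = igpu_mm2 / total_die_mm2
    print(f"{name}: identical {igpu_mm2:.0f} mm2 iGPU = {share:.0%} of the die")
```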
 

Panino Manino

Senior member
Jan 28, 2017
821
1,022
136
This new Intel GPU is good, but only as an iGP... considering the proportion of the die that this iGP uses. Is it better than Vega? Yes, but Vega is very old, with memory compression that isn't that efficient. This thing will be obliterated by RDNA, and can you imagine how well it'll perform on a 5nm APU?

What saves Intel is that AMD is only delivering the bare minimum to stay ahead; they have no reason to give us the best iGP they can fit on an APU. As Ian discussed in his article, for the people actually interested in using these APUs to play modern games, the current ones are already good enough to keep them satisfied.

It's hard to believe for people used to playing with discrete graphics, but I can confirm: a player can be very happy with lower settings and frame rates.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Oh, and here is some actual data relevant to the dGPU, not the iGPU.


The top SKU has 512 EUs, 16 GB of GDDR6 on a 256-bit bus, and 1.8 GHz Turbo clocks in a laptop configuration.

Over 50% in the fully GPU-limited tests is slightly better than previous gen?

A 50% gain is great for a CPU, but mediocre for an iGPU when you are starting from so low.

You need at least 2x to make any noticeable difference.
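To put the top-SKU figures in perspective, here is a back-of-the-envelope sketch of theoretical peak compute and memory bandwidth. It assumes the usual 8 FP32 ALUs per Xe EU, and the 16 Gbps GDDR6 data rate is my assumption, since the post only gives the bus width:

```python
# Theoretical peak FP32: EUs * 8 FP32 ALUs per EU * 2 ops per clock (FMA) * clock (GHz).
eus, alus_per_eu, clock_ghz = 512, 8, 1.8
peak_tflops = eus * alus_per_eu * 2 * clock_ghz / 1000
print(f"Peak FP32: {peak_tflops:.1f} TFLOPS")   # ~14.7 TFLOPS

# Memory bandwidth: bus width (bits) / 8 * data rate (Gbps) -> GB/s.
bus_bits = 256
data_rate_gbps = 16   # assumption - the post doesn't state the GDDR6 speed
bandwidth_gbs = bus_bits / 8 * data_rate_gbps
print(f"Bandwidth: {bandwidth_gbs:.0f} GB/s")   # 512 GB/s at 16 Gbps
```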
 
  • Like
Reactions: Tlh97

jpiniero

Lifer
Oct 1, 2010
14,610
5,227
136
Oh, and here is some actual data relevant to the dGPU, not the iGPU.


The top SKU has 512 EUs, 16 GB of GDDR6 on a 256-bit bus, and 1.8 GHz Turbo clocks in a laptop configuration.

The 512 and 384 EU models might be good enough to be profitable for mining.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
Vaporware is pretty... that would make one hell of an interesting design, seeing as I think those are 3 blowers.
But again, it's 10000000000000000% vaporware pictures.

[attached image: 960x0.jpg]
 
  • Like
Reactions: Tlh97 and NTMBK

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
To be honest with you guys, I had not been paying attention to Intel's GPU efforts until I heard rumors about the prices of those GPUs.

Midrange for $300? I'm up for that!

Also, it's interesting to see that a 4096-ALU GPU with 256-bit GDDR6 VRAM, clocked at around 2000 MHz, is going to perform around the RX 6800/RTX 3070 Ti, GPUs which should have similar ALU counts.

If that holds and filters down the product stack, the 384 EU/192-bit chip might perform around the RX 6700 XT. If rumors are correct, this GPU might cost $300, which would be logical since it's supposed to have a die size of roughly 190 mm² (according to VCZ's calculations).

If Intel is able to deliver enough GPUs to the desktop DIY market, I might pull the trigger, just out of curiosity.
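For anyone wondering where that RX 6700 XT guess comes from: relative to the 512 EU/256-bit top SKU, the cut-down part keeps 75% of the ALUs and 75% of the bus width, so a naive linear estimate lands it at roughly three quarters of the top SKU. This is only a scaling sketch under the rumored configurations; real performance never scales perfectly linearly:

```python
# Naive linear scaling of the rumored cut-down SKU against the full die.
full_eus, full_bus_bits = 512, 256
cut_eus, cut_bus_bits = 384, 192

compute_ratio = cut_eus / full_eus              # 0.75
bandwidth_ratio = cut_bus_bits / full_bus_bits  # 0.75, assuming the same GDDR6 speed

print(f"ALUs     : {compute_ratio:.0%} of the top SKU")
print(f"Bandwidth: {bandwidth_ratio:.0%} of the top SKU")
# If the 512 EU part really lands near the RX 6800/3070 Ti, ~75% of that is
# roughly RX 6700 XT territory - which is the reasoning above.
```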
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
If Intel is able to deliver enough GPUs to the desktop DIY market, I might pull the trigger, just out of curiosity.

Plus, you can't even run a single crypto-mining algorithm (out of the dozens in NiceHash) on them. The support is absolutely bare, despite the OpenCL support.

If you have an AMD Ryzen with a Vega iGPU, they'll run.
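If you want to check for yourself what a given driver actually exposes before blaming the mining software, a quick way is to enumerate the OpenCL platforms and devices it reports. A minimal sketch using pyopencl (my choice of tool, assuming it and an OpenCL runtime are installed):

```python
import pyopencl as cl

# Print every OpenCL platform and device the installed drivers expose.
for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for dev in platform.get_devices():
        print(f"  Device : {dev.name}")
        print(f"  Version: {dev.version}")
        print(f"  CUs    : {dev.max_compute_units}")
        print(f"  Memory : {dev.global_mem_size / 2**30:.1f} GiB")
```

Whether a given mining kernel then compiles and runs on that device is a separate question, which is where the Intel runtimes have reportedly fallen over.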
 

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
Plus, you can't even run a single crypto-mining algorithm (out of the dozens in NiceHash) on them. The support is absolutely bare, despite the OpenCL support.

If you have an AMD Ryzen with a Vega iGPU, they'll run.
Interesting.

Crossed fingers for good Linux drivers, tho(yes, I'm a Linux user!).
 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
If Intel is able to deliver enough GPUs to the desktop DIY market, I might pull the trigger, just out of curiosity.

The opening is there. All Intel has to do is launch product in a timely fashion and in quantity. Easier said than done.

Plus, you can't even run a single crypto-mining algorithm (out of the dozens in NiceHash) on them. The support is absolutely bare, despite the OpenCL support.

I've heard Intel's OpenCL support has been bad for years. Why do Intel iGPUs not run crypto algorithms?
 
  • Like
Reactions: Tlh97 and Glo.

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
I've heard Intel's OpenCL support has been bad for years. Why do Intel iGPUs not run crypto algorithms?
Intel's OpenCL support was good ONLY on macOS. On Linux and Windows it was horrible.

For a very long time Intel's iGPU drivers for Windows were horrible as well...