News Intel GPUs - Intel launches A580


mikk

Diamond Member
Intel never claimed the full lineup was shipping in Q1. They merely stated Arc would launch in Q1. Intel usually releases stuff like this on mobile first (products on new nodes, etc.).


Yes, it seems like mobile first; there is a reason mobile Arc was showcased instead of a desktop version. But even in that case it's a very limited Q1 launch.
 

Shivansps

Diamond Member
I didn't have time to test games today, but I did fire up "Japan in 8K" to check AV1 decode and power consumption.

[screenshot]
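As an aside, the decode check is reproducible outside a browser. A minimal sketch, assuming ffmpeg is on PATH and you have an AV1 sample file (the filename here is made up); on Windows, d3d11va is the hardware-decode path:

```python
# Sketch: confirm AV1 decode runs on the GPU rather than the CPU.
# Assumes ffmpeg on PATH; "japan_8k_av1.mkv" is a hypothetical sample file.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hwaccel", "d3d11va",   # Windows hardware-decode path
     "-i", "japan_8k_av1.mkv",
     "-f", "null", "-"],                # decode only, discard the output
    capture_output=True, text=True,
)
# ffmpeg reports progress/speed on stderr; speed >= 1x with low CPU usage
# while GPU-Z shows video-engine load indicates hardware decode.
print(result.stderr[-1000:])
```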


This would be an amazing HTPC GPU. Why, Intel? WHY?

Steve was right: there is a ROM chip on the GPU PCB. I don't understand what's missing here.

Then I fired up FurMark and realised something:
[screenshot]


The GPU does a very short burst to 1500 MHz, then drops to 1150 MHz; it seems to be TDP-limited at 10.6 W board power (that's the only power reading I have). GPU temps climb past 74°C with that setting, and eventually the clock starts dropping to 1100 MHz because the temperature affects power consumption. I added a small fan to drop temps to 64°C and got it stable at 1200 MHz.

I need to see exactly what happens in gaming, but if the GPU is constantly bouncing between 1500 and 1150 MHz, that may explain the frametime issues Steve had. Or maybe not; he would have realised what the problem was.

I really want to see if I can get rid of that power limit, but I don't think there is a way. The limit is probably there because of the passive cooling.
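For anyone who wants to log this behaviour themselves, here is a minimal sketch that parses a GPU-Z sensor log (GPU-Z's "Log to file" option writes comma-separated values; the column names below are assumptions, so check the header of your own log and adjust):

```python
# Sketch: spot the burst-then-throttle pattern in a GPU-Z sensor log.
# Column names are assumptions; check the header of your own log file.
import csv

with open("GPU-Z Sensor Log.txt", newline="", encoding="utf-8", errors="replace") as f:
    rows = list(csv.DictReader(f, skipinitialspace=True))

clocks = [float(r["GPU Clock [MHz]"]) for r in rows]
temps = [float(r["GPU Temperature [°C]"]) for r in rows]

print(f"clock: peak {max(clocks):.0f} MHz, median {sorted(clocks)[len(clocks)//2]:.0f} MHz")
print(f"temp : peak {max(temps):.0f} °C")
# A peak near 1500 MHz with a median around 1150 MHz is exactly the
# short-burst-then-power-limit behaviour described above.
```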
 

PingSpike

Lifer
This would be an amazing HTPC GPU. Why, Intel? WHY?

Steve was right: there is a ROM chip on the GPU PCB. I don't understand what's missing here.

I assume you're referring to the OEM vendor lock that you have a workaround for.

I was quite disappointed with this, and it kind of turned me off all of their upcoming discrete GPUs. What does this DG1 idle at?
 

mikk

Diamond Member
The GPU does a very short burst to 1500 MHz, then drops to 1150 MHz; it seems to be TDP-limited at 10.6 W board power (that's the only power reading I have). GPU temps climb past 74°C with that setting, and eventually the clock starts dropping to 1100 MHz because the temperature affects power consumption. I added a small fan to drop temps to 64°C and got it stable at 1200 MHz.


This is an 80EU DG1, right? 10.6 W at 1200 MHz on 80 EUs seems about right compared to my 1300 MHz 96EU part, which draws roughly 15 W in FurMark. So I think what GPU-Z shows isn't board power, it's GPU power. Board power is limited to 25 W overall, as we can see in the HWiNFO screen, and that's the problem. It's a very low GPU power limit; every 28 W TGL-U device can run its iGPU at the full 1300-1350 MHz turbo. No wonder it sometimes struggles so hard against the Xe iGPU.
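A quick back-of-the-envelope check of that comparison (my own rough model: power scaling linearly with EU count and clock, ignoring voltage differences):

```python
# Rough sanity check: scale the 96EU/1300MHz FurMark reading down to
# the 80EU/1200MHz DG1 (assumes power ~ EUs x clock at similar voltage).
ref_watts, ref_eus, ref_mhz = 15.0, 96, 1300   # 96EU Tiger Lake iGPU in FurMark
dg1_eus, dg1_mhz = 80, 1200                    # DG1 under FurMark

est = ref_watts * (dg1_eus / ref_eus) * (dg1_mhz / ref_mhz)
print(f"estimated DG1 GPU power: {est:.1f} W")  # ~11.5 W, close to the 10.6 W reading
```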
 

IntelUser2000

Elite Member
I assume you're referring to the OEM vendor lock that you have a workaround for.

I was quite disappointed with this, and it kind of turned me off all of their upcoming discrete GPUs. What does this DG1 idle at?

Why would it be an issue with DG2? DG1 is a rushed project, so they didn't have time to finish the vBIOS. DG2 will have it. Even if the ROM is on board, it doesn't mean it's fully functional; it has to use the motherboard BIOS.

It's not an intentional lock.

This is an 80EU DG1, right? 10.6 W at 1200 MHz on 80 EUs seems about right compared to my 1300 MHz 96EU part, which draws roughly 15 W in FurMark. So I think what GPU-Z shows isn't board power, it's GPU power.

I agree with mikk. And 25W is probably the real board power. GPU-Z is OK, but it has flaws. HWInfo is the best.
 

PingSpike

Lifer
Why would it be an issue with DG2? DG1 is a rushed project, so they didn't have time to finish the vBIOS. DG2 will have it. Even if the ROM is on board, it doesn't mean it's fully functional; it has to use the motherboard BIOS.

It's not an intentional lock.
Sounds like a pile of excuses to me. Intel actually has a history of locking features behind chipset product tiers, to the point that they make their product offerings uncompelling. You and I may think that is an idiotic business decision, but I've seen them do it enough times to believe it's part of their corporate culture at this point.

There used to be a ton of small-time video card manufacturers, and they managed to make their ROMs boot fine on different platforms despite having a fraction of the resources Intel has. Intel was in such a rush they couldn't get this working? If that is true, it doesn't sound to me like they are serious.

Has Intel explicitly said anywhere yet that DG2 will work on all platforms? Until I see that stated in unambiguous terms, I will remain skeptical.
 

IntelUser2000

Elite Member
There used to be a ton of small-time video card manufacturers, and they managed to make their ROMs boot fine on different platforms despite having a fraction of the resources Intel has. Intel was in such a rush they couldn't get this working? If that is true, it doesn't sound to me like they are serious.

It's not just a lock to Intel platforms, it's a lock to a specific Intel platform, and I doubt they can be that stupid on purpose.

despite having a fraction of the resources Intel has.

Small teams can be much more nimble. Work at a small business and you can talk to your boss every day. Endless meetings and teleconferences are a big-company thing, and they create delays. So if the entire company makes something a priority, then the resources matter; but their skunkworks projects will suffer.

The ship analogy is very on point. On a fishing boat you can change direction very easily; try doing that on a freight ship and it takes forever. Where the ship is pointing is where the strategy is aimed. Sure, the big freighter will absolutely demolish everything and has immense power, but it takes forever to turn.

For a small manufacturer, making a proper vBIOS is a make-or-break business decision, so it makes sense that they put all their resources into it. For a huge company, not at all.

So despite having 100K employees, the company mostly acts as a homogeneous entity that can reliably point in a single direction (it may look like multiple directions sometimes, but everything is really aligned to one).

My point is that they were simply not ready. Multiple pieces of evidence point to the same thing: coincidences like using the same peculiar LPDDR4X memory type, and the 25 W limitation. Or are they coincidences? They likely took the GPU portion of Tiger Lake and did the minimum needed to say they have a "discrete GPU".

Firmware programming is one of the hardest things to do. If you really want to prove it, get your hands on the cards and extract the vBIOS from the ROM chip to see if there's anything useful.
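For the curious, here is a minimal sketch of one way to grab the ROM contents on Linux via sysfs (the PCI address is a placeholder; run as root). Note this dumps the option ROM the PCI layer exposes, which is exactly the interesting question here; reading the physical SPI chip directly would need a hardware programmer:

```python
# Sketch: dump a GPU's option ROM (vBIOS) via Linux sysfs.
# "0000:03:00.0" is a placeholder PCI address; find yours with lspci.
from pathlib import Path

rom = Path("/sys/bus/pci/devices/0000:03:00.0/rom")

rom.write_text("1")              # enable reads of the ROM BAR
data = rom.read_bytes()
rom.write_text("0")              # disable again

Path("vbios.bin").write_bytes(data)
print(f"dumped {len(data)} bytes")
# A valid option ROM starts with the 0x55AA signature.
print("valid ROM signature" if data[:2] == b"\x55\xaa" else "no valid ROM signature")
```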
 

Shivansps

Diamond Member
They are limiting this product to specific Intel motherboards, and only ASUS ones (for now), not JUST Intel boards; that's why I really have a hard time believing this is not TGL silicon with the CPU disabled. I just don't understand it: they could be selling this card right now, so why shoot yourself in the foot like this? Are they waiting to launch the DG1 at retail after DG2, so it doesn't create the bad impression of Intel GPUs = low end?

What's worse, I see no reason why ASUS couldn't just add DG1 support to all their Intel motherboard BIOSes by default. Instead you have to ask them for a board-specific BIOS with DG1 support that is no different in any other way from the regular BIOS.
 

IntelUser2000

Elite Member
They are limiting this product to specific Intel motherboards, and only ASUS ones (for now), not JUST Intel boards; that's why I really have a hard time believing this is not TGL silicon with the CPU disabled. I just don't understand it: they could be selling this card right now, so why shoot yourself in the foot like this?

LOL. That's horrible. Yes, such confounding decisions can be explained by them simply not being ready for it, but there was pressure to release it.

The die is 95 mm², so it seems big enough that they made a new cut just without the CPU portion. So the IPU, display, OPI, and memory controller all still exist.

I suspect that in late 2018, when they announced their intention to enter the dGPU market, it was just that: an announcement. They had absolutely nothing at that time, so work probably started at or shortly after the announcement. No wonder they couldn't release it this year; it can take over a year just for silicon and software verification! So by the time DG1 started shipping, they had had just two years, assuming no delays and no screwups.

Sounds like a pile of excuses to me. Intel actually has a history of locking features behind chipset product tiers, to the point that they make their product offerings uncompelling.

The whole company didn't get this big by being complete morons. Locking features down is done after a monopolistic position is established, not when you are entering a brand-new market! By the way, Nvidia does this as well. Oh wait, it's not just in computers; all monopolies do this!

The work they do on the Linux open-source drivers (and with the community) is top notch. Their WiFi and networking also work perfectly across all platforms, including AMD systems and super-old Intel devices.

They could have made XeSS proprietary too, but no: they are going to do just as much as AMD, if not more, against the proprietary DLSS approach, because XeSS is open.
 

beginner99

Diamond Member
They could have made XeSS proprietary too, but no: they are going to do just as much as AMD, if not more, against the proprietary DLSS approach, because XeSS is open.

If you enter a market you need to be open. Just look at MS: why did they change from 100% proprietary to more open? Because otherwise they would eventually have been pushed out of the market. But once a company dominates, it will raise prices and lock things down. And yes, this also applies to AMD; just look at Ryzen pricing.
 

Shivansps

Diamond Member
OK, first, updated power figures:

idle: [screenshot]
VP9 4K: [screenshot]
AV1 8K: [screenshot]


As for gaming, the gaming clock is generally 1300 MHz, but as temperatures rise toward 80°C it tends to drop to 1200 MHz. The GPU is capped at a 25 W TDP; no way around that. With a fan it is possible to stabilise it at 1300/1350 MHz, but I didn't do that because that's cheating.
There is a general issue with lows: you may be at 35-40 fps, then suddenly 23, and right back to normal. It is very noticeable. Very likely these are the frametime consistency issues Steve mentioned.
DX11/DX9 and OpenGL are fine, at least with the games I tested; there seems to be a general light/texture glitch present on both Vulkan and DX12.
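Those lows are easy to quantify from a frametime capture. A minimal sketch, assuming a PresentMon-style CSV with an MsBetweenPresents column (that is PresentMon's column name; adjust for other capture tools):

```python
# Sketch: average FPS vs 1% lows from a frametime capture.
import csv
import statistics

frametimes = []  # milliseconds per frame
with open("capture.csv", newline="") as f:
    for row in csv.DictReader(f):
        frametimes.append(float(row["MsBetweenPresents"]))

avg_fps = 1000 / statistics.mean(frametimes)
cut = max(1, len(frametimes) // 100)                         # slowest 1% of frames
low_fps = 1000 / statistics.mean(sorted(frametimes)[-cut:])

print(f"average: {avg_fps:.1f} fps, 1% low: {low_fps:.1f} fps")
# A large gap between the two numbers is the visible stutter described above.
```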

The glitch:
It is clearly visible in RDR2, especially with High/Ultra textures.
[screenshots]



Fortnite 1080p medium
[screenshot]

Since this is a competitive game, to actually play it you need to use Performance mode or a mix of low, medium, and high settings. It is playable at 1080p.
The texture/light glitch is present in DX12 mode.

GTA V 1080p @ High (FXAA)
[screenshot]


Warzone 1080P @ Low
[screenshot]

This is not playable at all at 1080p; I needed to drop the render resolution to 1600x900.

RDR2 1600x900 @ Low + High textures
[screenshots]


SOTTR 1080P @ LOW DX11
[screenshot]

This is playable at 900p. As a side note, I was able to run the DX12 benchmark once; then I went to change settings and it never ran again, continuously crashing at the logo. I don't think Intel is to blame for this one; it was crashing the moment it should have displayed the Epic launcher overlay for the first time.

Witcher 3 1080P @ High no hairworks
[screenshots]


Then I decided to murder it with FreeSpace 2 Open to see if the OpenGL driver was fine. It was.
FS2Open 1080P
[screenshots]


It was working fine as long as nothing too crazy was going on... this is also the first case where I saw it go over the 4 GB of VRAM and actually use system RAM as well.

This is a GT 1030 GDDR5-level GPU. It's not bad, but it's also below Cezanne APUs, and Vega achieves that while sharing bandwidth with the CPU. So this GPU would sell for sure, but it's not really something I would present as "my first GPU", so I'm wondering if Intel's plan is to release DG2 first and then come back and do the retail launch of DG1.
 

IntelUser2000

Elite Member
As for gaming, the gaming clock is generally 1300 MHz, but as temperatures rise toward 80°C it tends to drop to 1200 MHz. The GPU is capped at a 25 W TDP; no way around that. With a fan it is possible to stabilise it at 1300/1350 MHz, but I didn't do that because that's cheating.
There is a general issue with lows: you may be at 35-40 fps, then suddenly 23, and right back to normal. It is very noticeable. Very likely these are the frametime consistency issues Steve mentioned.

Very interesting, thanks for the work.

Intel said the iGPU version can be better because the dGPU has to go through the slower PCI Express link and also has higher latency.

So if we assume a direct port of the Tiger Lake iGPU, the moments where it screeches to a halt may be cases where the iGPU has no issue thanks to faster communication, but the DG1 does. Unless they want developers to patch around it, fixing it in hardware will likely be the only option, as drivers will only go so far.

With low-VRAM cards, PCI Express makes a difference: https://www.pcgameshardware.de/Rade...pecials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/

You can see the PCI Express 3.0 version is significantly worse in frametimes.
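The arithmetic behind that (my own rough numbers for effective x16 bandwidth, one direction, after encoding overhead):

```python
# Why spilling out of VRAM shows up as frametime spikes: time to stream a
# hypothetical 0.5 GB overflow over the PCIe link.
links = {"PCIe 3.0 x16": 15.8, "PCIe 4.0 x16": 31.5}  # ~GB/s, one direction

spill_gb = 0.5  # hypothetical texture working set that didn't fit in VRAM
for name, bw in links.items():
    print(f"{name}: {spill_gb / bw * 1000:.0f} ms to move {spill_gb} GB")
# ~32 ms vs ~16 ms: at 60 fps (16.7 ms/frame), the slower link alone can
# cost one or two whole frames, which reads as a frametime spike.
```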

This is a GT 1030 GDDR5-level GPU. It's not bad, but it's also below Cezanne APUs, and Vega achieves that while sharing bandwidth with the CPU.

At a good enough price it would have sold for older systems, and it has the latest media support. They definitely missed an opportunity.
 

IntelUser2000

Elite Member
Fascinating. What do you surmise is the reason? PCI Express 4.0 offers lower latency?

Just bandwidth. On 4GB cards it can run out of VRAM, so it uses the PCI Express link to address system memory.

Higher-end cards generally have much more VRAM, so they don't lean on the PCIe link as much. Although if you go too low it will still hurt performance noticeably; PCIe 3.0 x8, for example, is fine for all the non-VRAM-starved cards.

Going to 5.0 will probably benefit these cards. But in the 5500 XT's particular case it's limited to an x8 link, so if they released a version with an x16 link it would be faster by virtue of that alone. 4.0 x16 would make it faster still, but as the review shows, it won't beat simply having 8GB of VRAM.
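For reference, the effective one-direction bandwidth per configuration (standard per-lane figures, rounded):

```python
# Effective PCIe bandwidth per link configuration (GB/s, one direction).
per_lane = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}  # GB/s per lane

for gen, gbs in per_lane.items():
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: {gbs * lanes:5.1f} GB/s")
# The 5500 XT's x8 link gets ~7.9 GB/s on 3.0 but ~15.8 GB/s on 4.0,
# which is why the PCIe generation matters once the 4GB buffer overflows.
```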

In Intel's case, the value version of DG2 will probably be reworked to account for the fact that it's a discrete GPU. A lot of the problems people attribute to drivers on Intel GPUs were often really due to hardware. I don't know; larger caches would probably help with the stuttering.

A DG1 successor based on HPG, with 50% better performance and the frametime problems fixed, on as small a PCB as possible, would probably sell decently. I know media-center folks and SFF builders would go nuts for it.

I will be getting myself a mid-range DG2 just to own one, and it'll probably come in useful for future repairs. I think over a long enough time it might even be worth something, lol.
 
Jul 27, 2020
I will be getting myself a mid-range DG2 just to own one, and it'll probably come in useful for future repairs. I think over a long enough time it might even be worth something, lol.
Great tip! It could fetch a great price, maybe 20 years from now, from a collector of old hardware, for history's sake.
 
Jul 27, 2020
It's interesting how old hardware gets a new lease on life when paired with newer motherboards sporting the latest tech. There was a review showing gameplay getting smoother, with a modest FPS boost, on old 3D cards simply from swapping the HDD for an SSD. And that was just SATA 3.
 

Shivansps

Diamond Member
The DG1 seems to support PCI-E 4.0, btw.

So it's mostly working fine. It could have made for a decent low-end GPU option with GDDR5.

Nah, LPDDR4X bandwidth is not the issue here; you can tell when the card drops from 1500 to 1300 MHz that it has enough bandwidth, it just needs higher clocks and TDP room. 68 GB/s is a lot for a GPU this small, and a lot more than what the GT 1030 GDDR5 or a Vega IGP has.
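The math behind those bandwidth figures, assuming the commonly reported 128-bit LPDDR4X-4266 configuration for DG1 (the GT 1030 and Vega numbers are standard spec-sheet values):

```python
# Peak memory bandwidth = bus width (bytes) * transfer rate (MT/s).
def bandwidth_gbs(bus_bits: int, mtps: int) -> float:
    return bus_bits / 8 * mtps / 1000  # GB/s

print(f"DG1, 128-bit LPDDR4X-4266:  {bandwidth_gbs(128, 4266):.1f} GB/s")  # ~68.3
print(f"GT 1030, 64-bit GDDR5-6000: {bandwidth_gbs(64, 6000):.1f} GB/s")   # ~48.0
print(f"Vega IGP, 2ch DDR4-3200:    {bandwidth_gbs(128, 3200):.1f} GB/s")  # ~51.2, shared with the CPU
```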
 

Shivansps

Diamond Member
As for mining, no miner works; not a surprise, really. Only lolMiner detects the DG1 as a valid OpenCL platform, but selecting it to mine or benchmark any algorithm ends in an "unsupported device or driver" error. Other miners won't detect the GPU at all.
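You can see exactly what the OpenCL runtime exposes with a few lines of Python; a sketch using pyopencl (my choice of tool; pip install pyopencl, with the Intel OpenCL driver installed):

```python
# Sketch: enumerate OpenCL platforms/devices to see how the DG1 is exposed.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  device: {device.name}, "
              f"compute units: {device.max_compute_units}, "
              f"global mem: {device.global_mem_size // 2**20} MiB")
# If the DG1 is listed here but miners still reject it, the block is in the
# miners' device checks or missing driver features, not basic OpenCL visibility.
```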

DG2 will eventually launch, and all the miners and HiveOS will add support for it.
 