News Intel GPUs - Intel launches A580

Jul 27, 2020
15,739
9,809
106
As for mining, no miner works, which is not a surprise really. Only lolMiner detects the DG1 as a valid OpenCL platform, but selecting it to mine or benchmark any algo ends in an "unsupported device or driver" error. Other miners won't detect the GPU.
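For anyone who wants to reproduce that check, enumerating the OpenCL platforms and devices is essentially the first thing a miner does before selecting a card. A minimal sketch with pyopencl (generic enumeration, not lolMiner's actual detection code):

```python
# List every OpenCL platform/device the installed drivers expose.
# If the DG1 doesn't show up here, no OpenCL miner will see it either.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
        print(f"    Global memory: {device.global_mem_size // 2**20} MiB")
        print(f"    Compute units: {device.max_compute_units}")
        print(f"    Version: {device.version}")
```

On the DG1 the platform apparently does show up, but anything past enumeration fails at device selection, which matches the "unsupported device or driver" error.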

The DG2 will eventually launch, and all the miners and HiveOS will add support for it.
Yes, but CompuBench is using a blockchain hashing benchmark representative of typical mining workloads. That could give at least some idea of Intel's compute capability for mining purposes.
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
When are we likely to hear more about DG2? Computex, or before? Just curious - not following the Intel ARC rumor mill.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Nah, LPDDR4X bandwidth is not the issue here; you can notice when the card drops from 1500 to 1300 that it has enough bandwidth, it needs higher clocks and TDP room. 68GB/s is a lot for a GPU this small, that's a lot more than what the GT 1030 GDDR5 or a Vega IGP has.

Are you saying the frametime increases and stuttering are due to clock fluctuations?

1.5 to 1.3GHz is a small difference in the big scheme of things. It's only 15%. If it's due to the clock, it seems like a bigger deal, like a power management maturity problem.

[GPU-Z screenshot of the DG1]

That GPU-Z screenshot says it has 8 ROPs and 16 TMUs. TPU's database says 24/48 for the 96EU part and 20/40 for the 80EU part.

The base Iris Xe G7 architecture is 24/48, so either 24/48 is right, or the 20/40 figure for the 80EU DG1 is correct.

GPU-Z doesn't actually detect those hardware features. The numbers are manually entered by the coders. I know, because I had to correct the ROP/TMU count for earlier generations and they changed it, but they later reverted to the erroneous figure.

There was some review showing how gameplay got smoother, with a modest FPS boost, for old 3D cards simply by swapping the HDD for an SSD. And that was just SATA 3.

And that's the truth. I used to have a G965 (GMA X3000, the basis of modern Intel GPUs with execution units), but paired with a Celeron D because I wanted to save money before I got a Core 2 Duo.

Since performance with that integrated graphics was highly dependent on the CPU, I only got about 20 fps in World of Warcraft.

Yet I played it, running 5-man dungeons and even raid instances, because the X25-M SSD kept the framerate stable, which the WD Raptor drive couldn't.

I can tell you that beyond a decent SATA SSD, faster storage doesn't benefit performance. The rated NVMe throughput is rarely ever achieved and is pretty much a marketing number.
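A crude way to see why is to compare sequential throughput (the marketing number) against small random reads, which is much closer to what game asset streaming looks like. A rough, Unix-only sketch; the scratch file path and sizes are just placeholders, and you'd want a cold cache (or O_DIRECT) for honest numbers:

```python
# Compare sequential 4 MiB reads against random 4 KiB reads.
# Game loading is dominated by the random case, where SATA and NVMe SSDs
# land far closer together than their sequential specs suggest.
import os, random, time

PATH = "scratch.bin"              # hypothetical pre-made test file, a few GiB
SIZE = os.path.getsize(PATH)

fd = os.open(PATH, os.O_RDONLY)

# Sequential: 4 MiB chunks, start to finish.
t0 = time.perf_counter()
done = 0
while done < SIZE:
    done += len(os.pread(fd, 4 * 2**20, done))
seq = time.perf_counter() - t0

# Random: 10,000 reads of 4 KiB at random offsets.
N = 10_000
t0 = time.perf_counter()
for _ in range(N):
    os.pread(fd, 4096, random.randrange(SIZE - 4096))
rand = time.perf_counter() - t0

os.close(fd)
print(f"sequential: {SIZE / 2**20 / seq:.0f} MiB/s")
print(f"random 4K:  {N * 4096 / 2**20 / rand:.2f} MiB/s, "
      f"{rand / N * 1e6:.0f} us avg latency")
```

On this kind of test any SSD crushes an HDD on the random case, while SATA and NVMe SSDs are nearly indistinguishable, which is the point above.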
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Are you saying the frametime increases and stuttering are due to clock fluctuations?

1.5 to 1.3GHz is a small difference in the big scheme of things. It's only 15%. If it's due to the clock, it seems like a bigger deal, like a power management maturity problem.

From what I observed, it's not the clock, but 200MHz does in fact have a significant impact on fps. For example, in Witcher 3 it was sitting at 37-38, then it dropped to 33-34 when the clock went down to 1300. It may not seem like much, but when you are so close to 30... It is similar to the gains on a 3400G from going from 1400 to 1600.
The drops are, again from what I observed, mostly when you are moving the camera. It really seems to have a problem moving stuff out of the render and adding new things that weren't there before. One would think "the VRAM is not enough and it is moving stuff from system RAM", and yeah, normally that's the reason for those drops, but we are talking about using 2GB of VRAM or maybe even less, and it still happens. Shader compilation at runtime also crossed my mind here; that would also cause it. But I see no reason why that would be different with an Intel GPU.

Anyway, it looks to me like a lot can be gained by raising the TDP limit and overclocking it (and adding a fan, because the cooler is barely enough as it is), but there is no software support for that.
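For what it's worth, those Witcher 3 numbers track the clock drop almost linearly, which is a quick sanity check that it's the clock/TDP limit and not memory bandwidth:

```python
# Quick check: does the fps drop track the clock drop?
clock_drop = 1 - 1300 / 1500      # ~13% lower clock
fps_drop   = 1 - 33.5 / 37.5      # ~11% lower fps (using the midpoints)
print(f"clock: -{clock_drop:.1%}  fps: -{fps_drop:.1%}")
# fps scales nearly 1:1 with the core clock here; a bandwidth-starved
# card would show a much weaker dependence on clock speed.
```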
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
From what I observed, it's not the clock, but 200MHz does in fact have a significant impact on fps. For example,

Yes, but according to GN, the drops are way more severe. It's like half the minimum fps compared to the Iris Xe, while the average fps is often higher.
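"Minimum fps" figures like GN's are normally derived from the frametime trace rather than an fps counter. Methodology varies by reviewer, but a common "1% low" calculation looks roughly like this sketch (the trace data is made up):

```python
# Average the worst 1% of frametimes and convert back to fps. A card can
# win on average fps while losing badly here if it stutters, which is
# exactly the DG1 pattern described above.
def fps_stats(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1pct = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct

# Hypothetical trace: steady 16 ms frames with occasional 60 ms hitches.
trace = [16.0] * 990 + [60.0] * 10
avg, low = fps_stats(trace)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")   # ~60.8 vs ~16.7
```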

The drops are, again from what I observed, mostly when you are moving the camera. It really seems to have a problem moving stuff out of the render and adding new things that weren't there before. One would think "the VRAM is not enough and it is moving stuff from system RAM", and yeah, normally that's the reason for those drops, but we are talking about using 2GB of VRAM or maybe even less, and it still happens.

So even in GPUs there's the concept of latency. Compared to on-chip caches, VRAM is lower bandwidth and higher latency, but useful for large data like textures. For small data, say instructions, caches will be much faster in bandwidth and especially latency.

Based on what Intel is saying, I surmise the iGPU can be faster because it can communicate with the CPU at ring bus speeds, which is lightning fast. They will need to do whatever they can to minimize the impact of communication that now has to go over the slower PCIe slot.

That's just one example. An iGPU design works differently, so a straight port to a dGPU would not be optimal.
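A crude way to feel that PCIe cost is to time tiny host-device round trips; on an iGPU the equivalent traffic rides the ring bus and shared LLC. A rough pyopencl sketch, just an illustration of the latency argument rather than anything Intel has published:

```python
# Time many tiny host<->device copies. On a dGPU every round trip pays
# PCIe latency; frequent small CPU<->GPU chatter is exactly what a design
# inherited from an iGPU has to learn to avoid.
import time
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

host = np.zeros(1024, dtype=np.uint8)    # tiny 1 KiB payload
dev = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, size=host.nbytes)

N = 1000
t0 = time.perf_counter()
for _ in range(N):
    cl.enqueue_copy(queue, dev, host)    # host -> device
    cl.enqueue_copy(queue, host, dev)    # device -> host
    queue.finish()                       # block so we measure latency, not queueing
elapsed = time.perf_counter() - t0
print(f"avg round trip: {elapsed / N * 1e6:.1f} us")
```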
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Looking good, now it all depends on drivers and efficiency.

Agree more with @NTMBK. We need to see how it actually handles common rendering tasks. But yeah, drivers are going to be a major factor here. The only thing TUM_APISAK's info indicates is that it's a capable compute card.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,761
777
126
A 16GB card with 3070 Ti performance would be nice indeed. For the right price, of course. None of this crappy 8GB of VRAM for me, thanks.
 

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
A 16GB card with 3070 Ti performance would be nice indeed. For the right price, of course. None of this crappy 8GB of VRAM for me, thanks.

Given that Intel is making a mining product, I'd be very surprised if it's not on the 3070's level in mining performance. Which means a real price of around a grand.
 

Grabo

Senior member
Apr 5, 2005
240
40
91
Given that Intel is making a mining product, I'd be very surprised if it's not on the 3070's level in mining performance. Which means a real price of around a grand.

Indeed. It makes sense that it would arrive with around the 3070's performance, price, and availability, since the market is the opposite of saturated and has been for a long time.
 

mikk

Diamond Member
May 15, 2012
4,111
2,105
136
2.1 GHz is roughly the expected clock speed for DG2-512. MLID said 2.2 GHz is the target for DG2-512.

According to MLID, desktop DG2 is coming in Q2; only laptop DG2 could launch in March.
 

Tup3x

Senior member
Dec 31, 2016
944
925
136
Knowing Intel, those GPUs will be cheap.

Intel is a company that always targets the mainstream consumer.
They could even sell them at a loss to gain market share and visibility. I wouldn't be surprised if they have some rather "interesting deals" for OEMs (for systems that use their CPUs)...
 

gdansk

Golden Member
Feb 8, 2011
1,973
2,353
136
They won't be selling them at a loss. In present market conditions... are you kidding? They can make a healthy profit and undercut Nvidia/AMD street prices easily.

Of course, it will be some time before the GPU group recoups its R&D expenses. So in that sense it will be at a loss.
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,445
126
They won't be selling them at a loss. In present market conditions... are you kidding? They can make a healthy profit and undercut Nvidia/AMD street prices easily.

Of course, it will be some time before the GPU group recoups its R&D expenses. So in that sense it will be at a loss.

Actually, it looks like Intel is going to be late to the party. The crypto bubble is currently bursting, and Intel will be releasing their new cards right around the same time miners will be panic selling their newish GeForce 30X0 and Radeon RX 6X00 cards on eBay.
 
  • Like
Reactions: Tlh97 and Leeea
Jul 27, 2020
15,739
9,809
106
Actually, it looks like Intel is going to be late to the party. The crypto bubble is currently bursting, and Intel will be releasing their new cards right around the same time miners will be panic selling their newish GeForce 30X0 and Radeon RX 6X00 cards on eBay.
Oh, how I love the sweet sound of that!
 
  • Like
Reactions: Tlh97 and Leeea

Glo.

Diamond Member
Apr 25, 2015
5,657
4,409
136
Actually, it looks like Intel is going to be late to the party. The crypto bubble is currently bursting, and Intel will be releasing their new cards right around the same time miners will be panic selling their newish GeForce 30X0 and Radeon RX 6X00 cards on eBay.
Don't get your hopes up about crypto bursting just yet ;).

Overall, yes, Intel will be late to the party. I don't expect them to be selling their products at a loss, but I fully expect Intel to offer CPU+GPU bundles, and potentially motherboards too.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
Actually, it looks like Intel is going to be late to the party. The crypto bubble is currently bursting, and Intel will be releasing their new cards right around the same time miners will be panic selling their newish GeForce 30X0 and Radeon RX 6X00 cards on eBay.

Sounds like wishful thinking to me.