News Intel GPUs - Intel launches A580


uzzi38

Platinum Member
Oct 16, 2019
2,625
5,895
146
Guess it depends on how much slower Navi 24 is versus the 6600. It does look pretty crippled so it might actually be that slow.
Bit of an understatement given it's 16 CUs vs 28 CUs you're comparing there.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Navi 24 is probably around 1650 Super to GTX 1660 level. The GTX 1650 non-Super is probably 12 CU RMB territory.

DG2-128 and Navi 24 are likely to go toe to toe.
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
Navi 24 is probably around 1650 Super to GTX 1660 level. The GTX 1650 non-Super is probably 12 CU RMB territory.

DG2-128 and Navi 24 are likely to go toe to toe.
N24 has a 64-bit bus with Infinity Cache. The ALU count is the same as DG2's.

The 128 EU DG2 has a 96-bit bus instead of 64-bit, an unusually large L2 cache for only 1024 ALUs, and 32 ROPs.

RDNA2 doesn't really have higher IPC than RDNA1 or Turing, so it will be very interesting to see how the two fare against each other.

I'm giving a slight edge to DG2, since it won't be limited by a 4 GB frame buffer.
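
Quick back-of-the-envelope math on those bus widths (assuming, purely for illustration, both use something like 16 Gbps GDDR6): a 64-bit bus gives 64/8 × 16 = 128 GB/s, while a 96-bit bus gives 96/8 × 16 = 192 GB/s, i.e. about 50% more raw bandwidth for DG2 before you factor in N24's Infinity Cache.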
 

Ajay

Lifer
Jan 8, 2001
15,431
7,849
136
DG2-512 absolutely will beat the 6600.

DG2-128 won't come close.
Why? We have no idea how performant the drivers will be. It could be the same issue as with ATi years ago: better hardware delivering lower FPS. Hardware capabilities are only half the equation.
 
  • Like
Reactions: ryan20fun

blckgrffn

Diamond Member
May 1, 2003
9,123
3,058
136
www.teamjuchems.com
Why? We have no idea how performant the drivers will be. It could be the same issue as with ATi years ago: better hardware delivering lower FPS. Hardware capabilities are only half the equation.

Oh, fond memories of installing different drivers for different games back in my 8500 days, ha. So much value in the hardware, and as a college student I had so much more time :D

I am really interested in how it plays out IRL. Can't wait for the reviews and the threads here :)
 

uzzi38

Platinum Member
Oct 16, 2019
2,625
5,895
146
Why? We have no idea how performant the drivers will be. It could be the same issue as with ATi years ago: better hardware delivering lower FPS. Hardware capabilities are only half the equation.
The 6600 is slightly lower performance than a 3060.

You genuinely think DG2-512 is going to be even weaker than that? Because I certainly don't.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Let's remember that drivers are far less important these days than they were in the ATI days, when everything ran through high-level APIs.

DX11 performance will be 100% Intel's responsibility, but Vulkan and DX12 are more of a 40/60 split with the developers, with the devs being the 60.
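
To make the "devs own more of it" point concrete, here's a minimal sketch of the kind of decision that moved from the driver into the game with the newer APIs. The helper below is purely illustrative, not from any real engine: under Vulkan the application itself picks which memory type an allocation lives in, and a choice that's ideal for one vendor's memory layout can be mediocre on another's.

```cpp
// Illustrative sketch only: with Vulkan the app, not the driver, chooses the
// memory type for each allocation, so per-vendor tuning lives in engine code.
#include <vulkan/vulkan.h>
#include <cstdint>

uint32_t pickMemoryType(VkPhysicalDevice gpu,
                        uint32_t allowedTypeBits,           // from VkMemoryRequirements
                        VkMemoryPropertyFlags wantedFlags)  // e.g. DEVICE_LOCAL
{
    VkPhysicalDeviceMemoryProperties props{};
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);

    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        const bool allowed = (allowedTypeBits & (1u << i)) != 0;
        const bool matches =
            (props.memoryTypes[i].propertyFlags & wantedFlags) == wantedFlags;
        if (allowed && matches)
            return i;  // engines often special-case vendors/heap sizes right here
    }
    return UINT32_MAX;  // caller must handle "no suitable type"
}
```

Under DX11 the equivalent placement decision was mostly the driver's problem, which is why Intel's DX11 driver quality matters so much more there.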
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,058
136
www.teamjuchems.com
Let's remember that drivers are far less important these days than they were in the ATI days, when everything ran through high-level APIs.

DX11 performance will be 100% Intel's responsibility, but Vulkan and DX12 are more of a 40/60 split with the developers, with the devs being the 60.

Ha, is that supposed to make us feel more or less optimistic?

I'd say that seems even worse, because no matter how many cards Intel ships, they will be drops in the sea in terms of currently in-use market share, meaning devs have little reason (outside of being directly paid by Intel) to bother with optimizations...
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
Let's remember that drivers are far less important these days than they were in the ATI days, when everything ran through high-level APIs.

DX11 performance will be 100% Intel's responsibility, but Vulkan and DX12 are more of a 40/60 split with the developers, with the devs being the 60.

Except the developers will be optimizing their DX12 code for Nvidia first, AMD second, and Intel a distant third (if at all). They've got a serious uphill battle. They can't just optimize their drivers now; they need to convince every major game developer to care about performance on Intel.
 

Ajay

Lifer
Jan 8, 2001
15,431
7,849
136
The 6600 is slightly lower performance than a 3060.

You genuinely think DG2-512 is going to be even weaker than that? Because I certainly don't.

I hope not, because that would mean Intel really screwed up - but I can't discount the fact that they may, in fact, screw up.

Let's remember that drivers are far less important these days than they were in the ATI days, when everything ran through high-level APIs.

DX11 performance will be 100% Intel's responsibility, but Vulkan and DX12 are more of a 40/60 split with the developers, with the devs being the 60.

Depends. Devs can choose to use APIs that are closer to the metal, which can reap performance benefits or hose them.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
DX11 was left to rot, and DX12/Vulkan have these issues that should be resolved on the game side... that's the problem with using this kind of API. We already have games that were designed and optimised for the first generation of DX12 GPUs, and they are likely to start having issues with newer hardware after a significant arch change.

GPU vendors can try to work around it on the driver side, but that ends up adding overhead. It's the price to pay for having low-level APIs in this kind of environment.
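
As a concrete (hypothetical) example of the kind of baked-in assumption that bites later hardware: plenty of early DX12 code was tuned around GCN's 64-wide waves. Querying the actual wave width at runtime is exactly the sort of per-hardware work that now sits in the game rather than in the driver. A minimal sketch, assuming a valid device already exists:

```cpp
// Minimal sketch, assuming a valid ID3D12Device* has already been created.
// Early-DX12-era code often hard-coded 64 (GCN wave size); newer GPUs report
// 32 (Turing, RDNA wave32) or 8/16/32 (Intel), so the assumption has to be
// fixed in the game/engine rather than papered over by the driver.
#include <d3d12.h>

UINT QueryMinWaveLaneCount(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS1 opts1 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS1,
                                              &opts1, sizeof(opts1))))
    {
        return opts1.WaveLaneCountMin;  // use this instead of a hard-coded 64
    }
    return 64;  // conservative fallback if the query isn't supported
}
```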
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,058
136
www.teamjuchems.com
Intel is more than capable of putting money hats on engine developers' heads.

Capable and willing are two different things, especially if the shareholders also have their hands out.

It seems to me like that would also take a couple of years to really filter down to games. Games seem to be held together by duct tape, bubblegum, and force of will just long enough to launch and maybe sell a DLC or two.

If Intel hadn't entered and then exited a number of markets (including GPUs, way back when the dinosaurs roamed) after failing to make inroads within a year or two, I would feel better about this.

Clearly they will be making "APUs" for the foreseeable future, so I guess if, in the long run, all we get is better-supported integrated graphics, that's not really a bad thing.
 

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
DX11 was left to rot, and DX12/Vulkan have these issues that should be resolved on the game side... that's the problem with using this kind of API. We already have games that were designed and optimised for the first generation of DX12 GPUs, and they are likely to start having issues with newer hardware after a significant arch change.

GPU vendors can try to work around it on the driver side, but that ends up adding overhead. It's the price to pay for having low-level APIs in this kind of environment.
You are just wrong.

DirectX 12 games work across Nvidia's, AMD's, and Intel's* historical product stacks, including ancient out-of-support hardware like the HD 7000 series, the GTX 600 series, and Haswell*-generation iGPUs (Iris Pro and Intel HD).

Believe me, the developers are not testing all that ancient hardware.


*Although, horrifyingly, with Intel they intentionally nerfed the drivers for DX12:

It seems that after Jan 20th, 2020 they stopped caring whether users of their older iGPUs had working DX12, and removed it from the feature set. Exactly the kind of support we do not want to see in a potential GPU provider.

Anyway, Intel has made DX12 run just fine in the past with their iGPUs. This is not a developer problem, not a DX12 problem; it is just an Intel problem.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
You are just wrong.

DirectX 12 games work across Nvidia's, AMD's, and Intel's* historical product stacks, including ancient out-of-support hardware like the HD 7000 series, the GTX 600 series, and Haswell*-generation iGPUs (Iris Pro and Intel HD).

Believe me, the developers are not testing all that ancient hardware.


*Although, horrifyingly, with Intel they intentionally nerfed the drivers for DX12:

It seems that after Jan 20th, 2020 they stopped caring whether users of their older iGPUs had working DX12, and removed it from the feature set. Exactly the kind of support we do not want to see in a potential GPU provider.

Anyway, Intel has made DX12 run just fine in the past with their iGPUs. This is not a developer problem, not a DX12 problem; it is just an Intel problem.

Sure, but how WELL did those games run? Were they hobbled by weird performance issues because the developers didn't optimise for Intel?

I know Riot optimise specifically for Intel integrated graphics, but most games don't.
 

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
Sure, but how WELL did those games run? Were they hobbled by weird performance issues because the developers didn't optimise for Intel?

I know Riot optimise specifically for Intel integrated graphics, but most games don't.
Thing is, developers typically optimize for one brand: the brand that sponsored their game.

Nvidia-sponsored games optimize for Nvidia*. AMD just has to make it work.
AMD-sponsored games optimize for AMD. Nvidia just makes it work.
Intel sponsors nothing and assumes it will work. It usually does work, but poorly. That is an Intel problem.

Intel is used to everyone optimizing around its CPUs for free, but that is not going to happen in the GPU segment. Game Ready drivers exist for a reason: AMD and Nvidia optimize their drivers for each individual game, something Intel has not done. That is why games just work on AMD/Nvidia.


*Nvidia-sponsored games are also known for implementing features designed to sabotage the competition: HairWorks, hardware-required PhysX**, etc. Pretty much the opposite of optimizing for all platforms. AMD makes it work anyway. Intel has far more resources than AMD; Intel just needs to make it work.

**This is not ancient history; Metro Exodus, for example, is optimized for Nvidia's hardware PhysX, HairWorks, and Nvidia-specific RTX ray-tracing extensions.
 
  • Like
Reactions: Tlh97

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Driver-based performance gains aren't usually so great that a 3070 Ti will end up performing like a non-Ti 3060.

It usually means there will be some titles that perform quite a bit lower than others: rather than sitting near the 3070 Ti average across most titles, some might end up faster than the 3070 Ti and some will end up lower than a 3060, producing the same average.

What good drivers provide is consistency. When it comes to average, the Iris Xe doesn't do half bad. It's in consistency where it has an issue. We want a product that performs consistently well across a wide range of titles.
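
A toy example with made-up numbers shows why the distinction matters: two cards can post the same average across three titles while one of them falls apart in its worst title.

```cpp
// Toy illustration with invented FPS numbers: same average, very different consistency.
#include <cstdio>

int main()
{
    int consistent[3] = { 80, 80, 80 };   // "good drivers": every title near the mean
    int spiky[3]      = { 115, 80, 45 };  // same mean, but one title tanks

    int sumA = 0, sumB = 0, worstA = consistent[0], worstB = spiky[0];
    for (int i = 0; i < 3; ++i) {
        sumA += consistent[i];
        sumB += spiky[i];
        if (consistent[i] < worstA) worstA = consistent[i];
        if (spiky[i]      < worstB) worstB = spiky[i];
    }
    std::printf("avg %d vs %d, worst title %d vs %d\n",
                sumA / 3, sumB / 3, worstA, worstB);  // avg 80 vs 80, worst 80 vs 45
    return 0;
}
```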
 

KompuKare

Golden Member
Jul 28, 2009
1,014
925
136
Except the developers will be optimizing their DX12 code for Nvidia first, AMD second, and Intel a distant third (if at all). They've got a serious uphill battle. They can't just optimize their drivers now; they need to convince every major game developer to care about performance on Intel.
You'd think that, since everything is developed console-first, at least the engine devs would spend more time concentrating on the consoles, and that this would trickle down to AMD optimisation by default. Otherwise, all those low margins and that huge share of TSMC 7nm wafers are doing AMD more harm than good.

Which might be the case anyhow. 10 million Radeon 6700 XTs sold instead of all those PS5s would mean developers could not afford not to optimise for AMD.
 
Feb 4, 2009
34,554
15,766
136
I dunno, Intel has money. They have relationships with vendors, and they have historically paid vendors to optimize stuff for their chips.
Certainly Intel knows what needs to be done to get game developers to optimize for their cards.
These cards aren't crappy throwaway integrated parts. I am confident the goal with laptop graphics is to keep it as cheap as possible while still driving the maximum screen resolution.
Comparing an add-on card that people are going to spend money on with an integrated chip that is there because it needs to be there isn't a fair comparison, IMO.