News: Intel GPUs - Intel launches A580


Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
In a way, yes. But a lot of their "driver issues", as perceived by users, have been due to their hardware. The X3000 had a weak geometry engine, but people expected driver magic to fix it. Yes, it took them forever to get hardware T&L support, which improved performance and compatibility in many games, but the unexpected lack of performance in many others was due to the low-performance geometry engine, not crappy drivers.

That, and software shaders. It wasn't until the later DX10-compatible Gen4 Intel GMA that there was hardware support for SM3.0/DX9c, which was effectively the standard API at the time.
 

gdansk

Golden Member
Feb 8, 2011
1,973
2,353
136
To some extent I'm hoping Intel's software support team will prioritize media, machine learning, and application-specific compute over games anyway. Their design seems better for that than RDNA, and in those areas Nvidia really needs a competitor. RDNA is competitive enough at gaming.
 

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
Jensen sees Nvidia as a software company first. Source from over a decade ago:
More recently reiterated as "NVIDIA is a software-defined company today":
The irony being that right around the time that comment was made, I suffered the first of many Nvidia products dying on me due to their poor hardware choices.

As in, packaging choices when the transition to lead-free solder happened. Millions of faulty products and very little coverage. A bit of extra attention paid to the basics of electronics (like which solder to choose) would have been a good idea at the time, but Nvidia were too busy boasting about their software prowess.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
To some extent I'm hoping Intel's software support team will prioritize media, machine learning, and application-specific compute over games anyway. Their design seems better for that than RDNA, and in those areas Nvidia really needs a competitor. RDNA is competitive enough at gaming.

I mean, that's why AMD has the separate CDNA architecture (and Intel has Xe-HPC). These are aimed squarely at compute.
 

gdansk

Golden Member
Feb 8, 2011
1,973
2,353
136
I mean, that's why AMD has the separate CDNA architecture (and Intel has Xe-HPC). These are aimed squarely at compute.
No, I guess I didn't convey what I meant. It isn't about HPC. And Xe HPG's double precision throughput seems a bit weird for a "focused on graphics" GPU.

Nvidia sells consumer cards that do it all: game, transcode, compute, ML, and so on. RDNA2 doesn't really compete because of software support (it lacks even the bare minimum of ROCm). But RDNA2 does gaming so well that it has a market at the right prices.

Xe HPG will have a hard time doing the same if its drivers are as bad as described, even if it's cheaper. What good is a discount GPU if it only works well for half your games? Maybe Intel can prioritize areas where Nvidia doesn't have any real competition yet.

For example, if Intel pushes DG2 media transcoder support into ffmpeg etc., then the 128EU model is a compelling media-box upgrade.
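
To make that concrete, here's a minimal sketch of the kind of workflow this would unlock, assuming DG2 ends up exposed through ffmpeg's existing Quick Sync (QSV) path; the file names and bitrate are placeholders:

import subprocess

# Hypothetical media-box job: decode H.264 and re-encode to HEVC on the
# GPU via ffmpeg's Quick Sync (QSV) codecs. Whether DG2 shows up as a
# QSV device is an assumption; input.mp4/output.mp4 are placeholders.
subprocess.run(
    [
        "ffmpeg",
        "-hwaccel", "qsv",     # hardware-accelerated decode
        "-c:v", "h264_qsv",    # QSV H.264 decoder
        "-i", "input.mp4",
        "-c:v", "hevc_qsv",    # QSV HEVC encoder
        "-b:v", "4M",          # placeholder target bitrate
        "output.mp4",
    ],
    check=True,
)

The nice part is that anything built on top of ffmpeg would inherit that support for free, hence "pushes support into ffmpeg" being the leverage point.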

Or, another example: if Intel works on good tensorflow and torch support for their consumer cards, then the 512/384EU models could be a compelling alternative to Nvidia for low-end ML.
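
For a rough sense of what "good torch support" would mean in practice, a minimal sketch, assuming Intel's intel_extension_for_pytorch package and the "xpu" device it registers (whether DG2 consumer cards end up served by that same path is an assumption):

import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device

# Toy training step on the Intel GPU; model and shapes are placeholders.
model = torch.nn.Linear(128, 10).to("xpu")
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
x = torch.randn(32, 128, device="xpu")
target = torch.randint(0, 10, (32,), device="xpu")

loss = torch.nn.functional.cross_entropy(model(x), target)
loss.backward()
optimizer.step()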

And surely either of those tasks is easier than adding fixes for thousands of games to bring their driver support up to Radeon or GeForce levels.
 
Last edited:

DiogoDX

Senior member
Oct 11, 2012
746
277
136
Performance seems promising, but I am still curious to see how stable the drivers will be.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
That, and software shaders. It wasn't until the later DX10-compatible Gen4 Intel GMA that there was hardware support for SM3.0/DX9c, which was effectively the standard API at the time.

The GMA X3000 I mentioned is Gen 5. Yes, Gen 4 had software shaders, but the X3000 had unified shaders. Also, Gen 4 had partial DX9 support. The X3000 had the hardware for DX10, but it was only enabled (probably due to a bug) in a revision, the GMA X3500 successor.

Feature-wise, the X3000 was pretty good. But that's not enough when the performance sucks.

The X3000 may have gotten fancy-schmancy DX10 unified shaders, but it went from a 4 pixel/1 texel per pipe config to an equivalent 2 pixel/4 texel one. And while it may have had hardware T&L, the performance was often worse than the CPU's, so it had the bare minimum of geometry hardware.

It's like going from an RTX 2080 Ti to an RTX 3060 Ti and wondering why the performance is worse despite the more advanced features.

No, I guess I didn't convey what I meant. It isn't about HPC. And Xe HPG's double precision throughput seems a bit weird for a "focused on graphics" GPU.

Tell me you guys aren't really arguing that HPG will potentially suck because it has just 2x the DP throughput of the 3070 Ti in the Sismark bench!

HPG's DP throughput, according to that result, is 1/20th of SP. That's insignificant. That kind of performance is emulation-level, not some extra hardware they added. Maybe the particular architecture of Intel iGPUs enables them to perform better at DP FP emulation under Sismark compared to Nvidia. This tells us nothing.
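
Back-of-the-envelope, for scale (the 3070 Ti figures are approximate public specs; the HPG numbers are inferred from the quoted ratios, not measured):

# Rough scale check on the quoted ratios. 3070 Ti figures are approximate
# public specs; everything about HPG here is inferred, not measured.
rtx3070ti_sp = 21.7                 # TFLOPS FP32 (approx.)
rtx3070ti_dp = rtx3070ti_sp / 64    # Ampere gaming parts run FP64 at 1/64 rate
hpg_dp = 2 * rtx3070ti_dp           # the "2x the DP throughput" bench result
hpg_sp = hpg_dp * 20                # the quoted 1/20 DP:SP ratio
print(f"HPG DP ~{hpg_dp:.2f} TFLOPS -> SP ~{hpg_sp:.1f} TFLOPS")

In other words, even "2x a 3070 Ti" is well under a teraflop of FP64.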
 
Last edited:

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
GMA X3000 I mentioned is Gen 5. Yes, Gen 4 had software shaders but X3000 had unified shaders. Also Gen 4 had partial DX9 support. X3000 had hardware for DX10 but it was enabled in a revision(probably a bug) in the GMA X3500 successor.

Not to quibble too much here, but the X3000 is most definitely a Gen4 design. Gen5 is Westmere's "HD" Graphics.

Anyway, Gen4 was a real mess. The 3000 is actually only hardware SM2.0-capable (Direct3D FL9_1); the rest is handled in software. The X3000 has hardware SM3.0, but is still only FL9_1. Only the X3500 and later 4000-models have full DX10/FL10_0 support.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Not to quibble too much here, but the X3000 is most definitely a Gen4 design. Gen5 is Westmere's "HD" Graphics.

Yup, you are right: Gen 4 is the X3000, so the predecessor is Gen 3, which is the GMA 9xx series, and you have Extreme Graphics 1 and 2 for Gen 1 and 2. The GMA 9xx series is the one that had partial DX9 support, with hardware pixel and software vertex shaders, which goes back to the original point.

Yes, I think the X3000 always had DX10 support but was dealing with lots of hardware bugs, so full support had to wait for the X3500. The lack of changes between the two chips (other than DX10) is the biggest hint. The non-X 3000 models were rebranded GMA 950 chips. AFAIK all the 4000 models moved to full DX10.

They dealt with the weak geometry in two ways, over two generations. The 4000 series outright doubled geometry performance. The HD Graphics series introduced proper occlusion culling, which further helped; not directly, but geometry was likely the bottleneck.

Gen 6 was probably the bare minimum for catching up with competitors. Gen 7 is the minimum.

I owned and played with Extreme Graphics 2, X3000, HD Graphics, and HD Graphics 3000. They were daily drivers for me. EG2 and X3000 were my WoW days.

You guys think their iGPUs are crap today. The Extreme Graphics 2 system had trouble running 3D RTS games like Command & Conquer: Generals (I think). From that experience, I know that when people cry foul now, they are following the modern trend of exaggerating anything and everything.
 
Last edited:

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
I mean, that's why AMD has the separate CDNA architecture (and Intel has Xe-HPC). These are aimed squarely at compute.

True, but better software support for compute, especially in commonly used ML libraries like tensorflow, would be much appreciated, as it would stop one from needing an NV GPU, which is usually limited in RAM unless you want to pay $5000+ for the AI card. RDNA would suffice as well for basic prototyping, but the software is lacking.
 
  • Like
Reactions: Tlh97 and NTMBK

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
RDNA is competitive enough at gaming.

Tell AMD to go make some more of them, then? Meanwhile, the rest of us are hoping Intel's offerings are "competitive enough" at gaming to at least put some more SKUs on the market that people can buy at reasonable prices.
 
Jul 27, 2020
15,740
9,809
106
In a rare display of solidarity, today Nvidia, AMD and Intel signed a joint agreement to end, once and for all, the GPU shortage that has plagued the gaming industry for years.

Called "One Gamer, One GPU", gamers will be shipped the card of their choice when they pay MSRP for the card at https://onegameronegpu.com. In order to register, a gamer will have to fulfill the following requirements:

Have a valid digital store ID for Steam, Epic Games Store, Ubisoft or Origin.

Have a minimum of 100 hours of gameplay time.

Must be at least 18 years old. Younger gamers will need to submit the Parental Consent form.

Gamers will need to adhere to the following rules:

Reselling the graphics card is NOT allowed. In case the gamer wishes to return the graphics card to get a different model, the gamer will need to submit a request for upgrade at OneGamerOneGPU website. Once the request is approved, the gamer will be given the choice to send the graphics card back or schedule a pick-up at a time most convenient for them.

Gamers found to be violating this rule will be banned for a minimum of 3 years. In addition, the gamer using the resold card will be unable to use it with the listed digital stores once the card is flagged as "stolen". The illegal gamer will also be subject to a minimum 3-year ban at OneGamerOneGPU.com, and the digital stores may, at their discretion, limit the privileges of, or outright ban, the offending gamer.

Raja Koduri, Vice President of Intel GPU Technology Group, had this to say, "We believe in a world where everyone has the right to game. Unfortunately, this had become next to impossible due to certain unscrupulous actors amassing graphics cards for the explicit purpose of depriving gamers of their rights, just because they had the means to do so. Today, we have changed the status quo and brought freedom back to the gaming industry. We will continue in our efforts to best serve gamers everywhere".

Jensen Huang, CEO of Nvidia, expressed great delight on this occasion, "We have always stood up for gamers. We owe our very existence to the passionate gamers who, despite suffering great difficulties in real life, never compromised on their gaming ambitions. This is truly a momentous step towards making gaming a part of our human DNA".

Lisa Su, CEO of AMD, speaking on behalf of her company's GPU division, exuded immense confidence in the success of this grand venture. "Gamers everywhere! Today we are united as ONE! Let us never again be forced to sell our homes to get our daily gaming fix. To game is to live and to live is to game. We will prevail as the most advanced species in the entire universe because gaming will take us there by evolving our minds to the ultimate level. Only with gaming do we stand a chance to bring about the Singularity. Game we must and Game we will!".
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Yup, you are right: Gen 4 is the X3000, so the predecessor is Gen 3, which is the GMA 9xx series, and you have Extreme Graphics 1 and 2 for Gen 1 and 2. The GMA 9xx series is the one that had partial DX9 support, with hardware pixel and software vertex shaders, which goes back to the original point.

Yes, I think the X3000 always had DX10 support but was dealing with lots of hardware bugs, so full support had to wait for the X3500. The lack of changes between the two chips (other than DX10) is the biggest hint. The non-X 3000 models were rebranded GMA 950 chips. AFAIK all the 4000 models moved to full DX10.

The basic 3000 had HW bugs, so anything higher than SM2.0 was intentionally disabled. The X3000 had software SM3.0, so a small improvement. The X3500 was the first one with full HW SM3.0.

I owned and played with Extreme Graphics 2,...

Oh, <deity>. I sympathise. I really do. I had the displeasure of working on one of those for a while. It wouldn't even work properly with basic GDI+, the VGA output was beyond poor, and there were 3D issues galore.
 

AnitaPeterson

Diamond Member
Apr 24, 2001
5,940
387
126
In order to register, a gamer will have to fulfill the following requirements:

Have a valid digital store ID for Steam, Epic Games Store, Ubisoft or Origin.

Have a minimum of 100 hours of gameplay time.

Must be at least 18 years old. Younger gamers will need to submit the Parental Consent form.


Gosh... what a cretinous idea!

First of all, this will be US-only.

Second of all, to demand a "valid digital store ID for Steam, Epic Games Store, Ubisoft or Origin" is basically like saying "You can't be a cinephile unless you have a Netflix or Amazon account".
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The basic 3000 had HW bugs, so anything higher than SM2.0 was intentionally disabled. The X3000 had software SM3.0, so a small improvement. The X3500 was the first one with full HW SM3.0.

Sorry, you are wrong there. The 3000 is exactly a GMA 950; it behaved very similarly to the GMA 950. I did lots of back and forth with owners, and personal research. The fillrate is 4/1, just like the GMA 950. If it were based on Gen 4, it'd be 2/2. Also, Gen 3 used immediate-mode tiling, which resulted in fillrate tests equal to the theoretical maximum, and that's what the 3000 showed. Non-tiling architectures like Gen 4 show much lower than theoretical.

The X3000 was originally meant for full DX10 support. That's why the X3500's only change was DX10, since all it did was fix hardware bugs.

Oh, <deity>. I sympathise. I really do. I had the displeasure of working on one of those for a while. It wouldn't even work properly with basic GDI+, the VGA output was beyond poor, and there were 3D issues galore.

No regrets there at all. They have a special place in my heart. I went out of my way to get them. I rooted for them. The Gen series really grew up. *Sniff*. Haha.
 
Last edited:

Dayman1225

Golden Member
Aug 14, 2017
1,152
973
146

Well, that explains the delays. Their GPUs and/or graphics drivers are full of security holes.

Intel 2021 Product Security Report
As per this article, 23 of those GFX vulnerabilities are from the Vega graphics in Kaby Lake-G.

Intel notes that the CVE INTEL-SA-00481 for Intel Core Processors with Radeon RX Vega M graphics features 23 vulnerabilities for AMD's components. Those appear to be for Intel's Kaby Lake-G processors, which paired 8th Gen Intel Core processors with AMD's Radeon graphics.
 

ryanjagtap

Member
Sep 25, 2021
108
127
96

As per this article, 23 of those GFX vulnerabilities are from the Vega graphics in Kaby Lake-G.
Well, there will always be vulnerabilities with such low support from Intel and AMD for the product. As far as I know, there was not much driver support for the Kaby Lake-G products.