No offense, but Intel has been making drivers for their GPUs for 20+ years. They just haven't focused on games (for obvious reasons). They have been putting in more effort lately, though.
Intel's drivers have been bad for gaming since the original "Extreme" Graphics*. Those were bad. They even had the CPU handle part of the rendering pipeline back then. Intel hasn't really helped themselves by leaving previous generations to wither on the vine while focusing on the new and shiny. They did that every CPU generation until Skylake, where they finally made a unified driver package going forward. By all accounts, they've been doing a pretty lousy job for 20+ years.
Those IGPs were added there just because they had extra room on the chipset. I don't think they ever expected such a big focus would be placed on the need for IGPs in desktops.
It was only after the IGP security debacle that they released new drivers for Gen7(.5) and Gen8 after all, and with a hefty performance penalty.
*The original "Extreme" Graphics (2) couldn't even render the Windows desktop properly (GDI+ decelerator). And was very often accompanied by... ahhh... "poor"... RAMDACs.
That was a hardware limitation: it didn't support vertex shaders in hardware like all other GPUs did (DX7 era, after the GeForce 256 and Radeon), but games required them, so it did them in software, up until the X3100 and 4500.
> If I remember well, SiS started the IGP thing with the highly integrated SiS 530 Socket 7 motherboards that had IGP, sound and even Ethernet... that was amazing at the time. I still have a working one stored; that may be the first ever commercial motherboard with an IGP, because I don't remember an earlier one. Then Intel followed with the 810 chipset (I believe), which started the mess of Extreme Graphics, with VIA joining shortly after buying S3's Savage.

The SiS530 really was the starting point for integrating as much as possible on the chipset. It was for cost cutting, but nobody minded that at the time, since it enabled much cheaper systems. I remember it well, but it wasn't really geared for anything to do with 3D. The original nForce, on the other hand: now there was a good chipset.
Software vertex shaders were only on the GMA 900 and newer, up until the GMA 3150. The original Extreme Graphics used a lot of other software tricks, and had the CPU actually control the rendering pipeline too, which was just what you needed on an already overloaded single core. Worse, these things were usually paired with budget CPUs, so performance was abysmal. It's buried somewhere in the documentation, but I can't remember where.
If I remember rightly, Intel started with integrated graphics because they just needed to do something with the spare die area on the northbridge. The chip had a minimum size due to IO requirements, and hey, why not throw some graphics in there?
I had a feeling that, up until some point, they wanted iGPUs to be like HD Audio: barely featured and using minimal die space.
It was the X4500/4500MHD that had acceptable hardware vertex shaders/T&L. The one in the GMA X3000/3500 performed worse than the CPU, so some games actually slowed down when you had it enabled.
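To make the hardware vs. software vertex processing point concrete, here is a minimal, hypothetical Direct3D 9 sketch of the kind of capability check games of that era did, falling back to software vertex processing when the adapter (like the early GMAs) doesn't report hardware T&L or a usable vertex shader version. Illustrative only; error handling omitted.

```cpp
// Hypothetical sketch: choose hardware or software vertex processing
// based on what the adapter actually advertises (needs d3d9.h / d3d9.lib).
#include <d3d9.h>

DWORD ChooseVertexProcessing(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Early Intel IGPs report no hardware transform & lighting and no
    // (or a useless) hardware vertex shader, so that part of the
    // pipeline ends up running on the CPU instead.
    const bool hwTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
    const bool hwVS  = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0);

    return (hwTnL && hwVS) ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                           : D3DCREATE_SOFTWARE_VERTEXPROCESSING;
}
```

The returned flag would go into the behavior flags of IDirect3D9::CreateDevice; when it ends up as software vertex processing, all of that work lands on the host CPU, which is exactly the scenario being described above.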
> I used to game with a SiS IGP on the motherboard back in the day with my single-core Celeron D 2.53GHz (early 2000s).

Was that the SiS630 by any chance? I had a laptop with it, and it could run most things.
The SiS530 (Socket 7) and SiS620 (Slot 1) had a SiS6306 integrated, which I'm pretty sure is just the integrated version of the 6326, with DX5 support. It was still able to handle some games, and it had the MPEG-2 decoder. Some boards even had 8MB of dedicated memory for the iGPU, so the iGPU supported both shared and dedicated RAM.
It was a Socket 478 Celeron D 2.53GHz CPU in there, so according to this list:
> It wasn't THAT bad, even with software features; it arrived two years before the SiS 315 and the nForce with the GeForce2 IGP.

When something can't even do the basic Windows desktop correctly, that's bad in my book. Especially when the CPU was loaded, Extreme Graphics (both the 1 and 2 variety, although 2 was way better behaved) frequently glitched out: windows not drawn or not updated, weird graphics glitches, lock-ups without updating, and so on. That's before starting on the ultra-crappy RAMDACs OEMs insisted on using with it, which is pretty important when you only have VGA output.
> By all accounts, they've been doing a pretty lousy job for 20+ years.

It's true, but having a dGPU will really ramp things up.
> But with Xe they are ready now. At no point in Intel's GPU history were they ready in any shape or form to get proper dGPUs out.

Let's hope so. Though to be honest, I would not be too upset if DG2's drivers suck for GPGPU. Not that I'm in the market for one, mind you. But I think it would be okay for the market as a whole if that happened.
They will get there.
> It may not be a case of Intel's GPGPU being bad so much as the crypto software simply not existing.

At least with their iGPUs, it's the OpenCL drivers. Mining software relies on CUDA for NVIDIA cards and OpenCL for anything else, and Intel's existing OpenCL drivers are just awful. Even if the hardware is capable, you wouldn't know it.
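For context on where OpenCL comes into it: a minimal, hypothetical C++ sketch of how a miner (or any GPGPU app) discovers non-NVIDIA GPUs through OpenCL. Whether an Intel GPU shows up here at all, and how well it behaves afterwards, depends entirely on the quality of Intel's OpenCL runtime.

```cpp
// Enumerate OpenCL platforms and count their GPU devices
// (link against the OpenCL ICD loader, e.g. -lOpenCL).
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main()
{
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);

    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id platform : platforms) {
        char name[256] = {};
        clGetPlatformInfo(platform, CL_PLATFORM_NAME, sizeof(name), name, nullptr);

        cl_uint numDevices = 0;
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices);
        std::printf("Platform '%s': %u GPU device(s)\n", name, numDevices);
    }
    return 0;
}
```

Enumeration is the easy part, though: a device can show up here just fine and still fall over once real kernels get compiled and queued, which is more or less the complaint about Intel's existing OpenCL stack.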
> How much will you really benefit from mining on an Intel GPU if the timeframe is that short?

There are more GPU-mineable algos out there than just Ethereum. Also I'm pretty sure Ethereum Classic (guffaw) will stay on PoW. But I could be wrong.
> But once you start selling it as an add-on card with nearly an order of magnitude better performance, well, people will care.

I fear it will work just fine for modern games but not for many older ones.
As a GoG gamer, I concur. Some old games were written when Intel IGPs were so bad that they check for Intel GPUs and refuse to run if one is present.
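For anyone wondering how a game "checks for Intel GPUs": the usual trick is to read the adapter's PCI vendor ID at startup and bail out (or disable features) when it matches Intel's 0x8086. A hypothetical Direct3D 9 era sketch of that kind of check:

```cpp
// Hypothetical vendor check of the sort some old games run at startup.
#include <d3d9.h>

bool IsIntelAdapter(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 ident = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident);

    // 0x8086 is Intel's PCI vendor ID. Games with an IGP blacklist simply
    // refuse to start, or drop to a crippled path, when they see it.
    return ident.VendorId == 0x8086;
}
```

Which is presumably why some of these titles would balk even at a modern Intel dGPU: the vendor ID is the same, no matter how different the hardware is from a 2005-era GMA.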
Do you guys remember when Intel came out with graphics cards, or at least graphics chips, and were trying to take on NVIDIA (and maybe 3DFX) in the 90s? What was their chip called? It got its ass handed to it mightily lol. Hope this time around they do a good bit better!!!
Intel740 - Wikipedia (en.wikipedia.org)
One of the first AGP video cards on the market. I think the Riva 128 was earlier though.
Ah, yes, the i740! Damn, I remember it was bad, but the Wikipedia link reminded me of just how bad. I remember the whole AGP hype - who needs memory when you have the fat AGP pipeline!!!! LOL, fun times!!! Thanks DrMrLordX!!!
> The i740 was released in February 1998, at $34.50 in large quantities.

Good old times. Read that again: less than $50! And we pay like >$100 for a GT 710 right now.
What are you talking about? It supported 3D shockwave effects! xD
Yeah, the whole concept of AGP using main memory was bad, BUT AGP allowed graphics chips to be placed outside of the 133MB/s shared bandwidth of the PCI bus. That alone was a huge win.
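Rough numbers behind that comparison, assuming 32-bit PCI at 33 MHz and the 66 MHz AGP base clock (nominal peak rates, not real-world throughput):

```latex
\begin{aligned}
\text{PCI: } &\ 4\,\mathrm{B} \times 33.3\,\mathrm{MHz} \approx 133\,\mathrm{MB/s} \quad \text{(shared by every device on the bus)} \\
\text{AGP 1x/2x/4x: } &\ 4\,\mathrm{B} \times 66.6\,\mathrm{MHz} \times \{1, 2, 4\} \approx 266/533/1066\,\mathrm{MB/s} \quad \text{(dedicated to the graphics chip)}
\end{aligned}
```

So even AGP 1x doubled the bandwidth of the entire PCI bus and didn't have to share it with anything else.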