
News Intel to develop discrete GPUs


guidryp

Golden Member
Apr 3, 2006
1,528
1,711
136
Yep, opportune moment where they could have named their price and sold all they could build.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,279
822
126
No offense, but Intel has been making drivers for their GPUs for 20+ years. They just haven't focused on games (for obvious reasons). They have been putting in more effort lately, though.
By all accounts, they've been doing a pretty lousy job for 20+ years.
Intel's drivers have been bad for gaming since the original "Extreme" Graphics*. Those were bad. They even had the CPU handle part of the rendering pipeline back then. Intel hasn't really helped themselves by leaving previous generations to wither on the vine while focusing on the new and shiny. They did that every CPU generation until Skylake, where they finally made a unified driver package going forward.

It was only after the IGP security debacle that they released new drivers for Gen7(.5) and Gen8 after all, with a hefty performance penalty.

*The original "Extreme" Graphics (2) couldn't even render the Windows desktop properly (GDI+ decelerator), and was very often accompanied by... ahhh... "poor"... RAMDACs.
 

Shivansps

Diamond Member
Sep 11, 2013
3,405
1,004
136
Intel's drivers have been bad for gaming since the original "Extreme" Graphics*. Those were bad. They even had the CPU handle part of the rendering pipeline back then. Intel hasn't really helped themselves by leaving previous generations to wither on the vine while focusing on the new and shiny. They did that every CPU generation until Skylake, where they finally made a unified driver package going forward.

It was only after the IGP security debacle that they released new drivers for Gen7(.5) and Gen8 after all, with a hefty performance penalty.

*The original "Extreme" Graphics (2) couldn't even render the Windows desktop properly (GDI+ decelerator), and was very often accompanied by... ahhh... "poor"... RAMDACs.
Those IGPs were added there just because they had extra room on the chipset. I don't think they ever expected such a big focus would be placed on the need for IGPs in desktops.

If I remember well, SiS started the IGP thing with the highly integrated SiS 530 Socket 7 motherboards that had an IGP, sound and even ethernet... that was amazing at the time. I still have a working one stored; that may be the first ever commercial motherboard with an IGP, because I don't remember an earlier one. Then Intel followed with the 810 chipset (I believe), which started the mess of Extreme Graphics, with VIA joining shortly after buying Savage.

It started to get A LOT better with Sandy Bridge, a lot really. Back in 2012, when I had all my GPUs mining bitcoin 24/7, I started to play with the Intel HD 3000 IGP. I also had a GT 520 that a friend gifted to me, and the HD 3000, when overclocked to 1.5GHz, outperformed the GT 520 in most games. At 1.8GHz and DDR3-1600 it managed to beat it pretty much always, and by some big margin. I still have the comparison and all the pictures I posted on another forum, which I just went to check out... it even managed to play some games at 1080p, wow.
 

SPBHM

Diamond Member
Sep 12, 2012
5,000
358
126
Intel's drivers have been bad for gaming since the original "Extreme" Graphics*. Those were bad. They even had the CPU handle part of the rendering pipeline back then. Intel hasn't really helped themselves by leaving previous generations to wither on the vine while focusing on the new and shiny. They did that every CPU generation until Skylake, where they finally made a unified driver package going forward.

It was only after the IGP security debacle that they released new drivers for Gen7(.5) and Gen8 after all, with a hefty performance penalty.

*The original "Extreme" Graphics (2) couldn't even render the Windows desktop properly (GDI+ decelerator), and was very often accompanied by... ahhh... "poor"... RAMDACs.
That was a hardware limitation; it didn't support vertex shaders in hardware like all other GPUs did (DX7, after the GeForce 256 and Radeon), but games required them, so it did them in software, up until the X3100 and 4500..

Also, all these IGPs were so slow for gaming that you can't really blame bad drivers.

Not to say Intel was up to Nvidia/ATI standards at all; their releases were always slow and didn't really get that much in terms of gaming optimization. But as I said, the hardware was so slow that...

My experience with the OG Intel IGP was actually pretty decent in 2000. I could play the latest games pretty OK at low settings, things like NFS Porsche, Counter-Strike, even Quake 3 I remember playing fairly OK. I even remember doing driver updates that actually fixed issues with games back then (NFS Porsche). Oh, and Rage Incoming worked great; I think it even had issues with the GeForce of the time!

But yes, with the HD Graphics line I think they started a pretty good jump in quality of the hardware/software; the current Xe stuff is looking pretty decent on the 11th gen IGPs.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,279
822
126
If I remember well, SiS started the IGP thing with the highly integrated SiS 530 Socket 7 motherboards that had an IGP, sound and even ethernet... that was amazing at the time. I still have a working one stored; that may be the first ever commercial motherboard with an IGP, because I don't remember an earlier one. Then Intel followed with the 810 chipset (I believe), which started the mess of Extreme Graphics, with VIA joining shortly after buying Savage.
The SiS530 really was the starting point for integrating as much as possible on the chipset. For cost cutting, but nobody minded that at the time since it enabled much cheaper systems. I remember it well, but it wasn't really geared for anything to do with 3D. The original nForce on the other hand. Now there was a good chipset.

That was a hardware limitation; it didn't support vertex shaders in hardware like all other GPUs did (DX7, after the GeForce 256 and Radeon), but games required them, so it did them in software, up until the X3100 and 4500..
Software vertex shaders were only on the GMA900 and newer, until the GMA3150. The original Extreme Graphics used a lot of other software tricks, and actually had the CPU control the rendering pipeline too, which was just what you needed on an already overloaded single core. Worse, these things were usually paired with budget CPUs, so performance was abysmal. It's buried somewhere in the documentation, but I can't remember where.
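(Since "software vertex shaders" come up a few times in this thread: it just means the driver runs the per-vertex transform on the CPU instead of in GPU hardware. A toy Python sketch of the idea, with made-up names, not anything from an actual Intel driver:)

```python
# Toy illustration of software transform & lighting (T&L): the driver
# multiplies every vertex by a model-view-projection matrix on the CPU
# before handing the results to the GPU's rasterizer.
def transform_vertices(mvp, vertices):
    """mvp: 4x4 row-major matrix; vertices: list of (x, y, z) tuples."""
    out = []
    for x, y, z in vertices:
        v = (x, y, z, 1.0)
        tx, ty, tz, tw = (sum(mvp[r][c] * v[c] for c in range(4))
                          for r in range(4))
        out.append((tx / tw, ty / tw, tz / tw))  # perspective divide
    return out

# An identity matrix leaves the vertices unchanged:
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(transform_vertices(identity, [(1.0, 2.0, 3.0)]))  # [(1.0, 2.0, 3.0)]
```

Doing that inner loop for every vertex of every frame on a single budget CPU core is exactly why these chips fell apart once games assumed hardware T&L.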
 

NTMBK

Diamond Member
Nov 14, 2011
9,472
3,010
136
The SiS530 really was the starting point for integrating as much as possible on the chipset. For cost cutting, but nobody minded that at the time since it enabled much cheaper systems. I remember it well, but it wasn't really geared for anything to do with 3D. The original nForce on the other hand. Now there was a good chipset.
If I remember rightly, Intel started with integrated graphics because they just needed to do something with the spare die area on the northbridge. The chip had a minimum size due to IO requirements, and hey, why not throw some graphics in there?

And now apparently AMD might be about to do the same with their IOD in Zen chips... Time is a flat circle
 

Borealis7

Platinum Member
Oct 19, 2006
2,824
151
106
I used to game with a SiS IGP on the motherboard back in the day with my Celeron D single core 2.53GHz (early 2000s). Many games did not require a discrete GPU back then, and the CPU had a much larger impact than today. Around 2006-7 is when things started getting more complex and an IGP simply was not an option anymore.
I think the last and most demanding game I played back then was Dungeon Siege 2 on med-low settings.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,643
2,525
136
If I remember rightly, Intel started with integrated graphics because they just needed to do something with the spare die area on the northbridge. The chip had a minimum size due to IO requirements, and hey, why not throw some graphics in there?
I had a feeling that, up until some point, they wanted iGPUs to be like HD Audio - barely featured and minimal die space.

Of course, it went a whole other way.

up until the X3100 and 4500..
It was the X4500/4500MHD that had acceptable hardware vertex shaders/T&L. The one in GMA X3000/3500 was lower performing than the CPU, so in some games it slowed down when you had it enabled.

Oh, and X3000 abandoned the immediate mode tile rendering the Extreme Graphics and GMA generation used and went backwards on the texture fillrate performance.

X3000 was when Intel actually started putting in effort.

4500 - Fixed hardware vertex shaders/T&L performance
HD Graphics(Clarkdale) - First implementation of occlusion culling along with on-package GMCH significantly improving performance
HD Graphics 2000/3000(Sandy Bridge) - LLC cache sharing, doubled FP performance per EU was another huge jump
HD Graphics 4000(Ivy Bridge) - Another doubled FP performance per EU

People blamed X3000's hardware T&L performance and Ivy Bridge's sometimes lower than expected performance on drivers. Actually X3000 didn't have enough hardware, and Ivy Bridge had a flaw where the GPU couldn't boost properly.
 

Shivansps

Diamond Member
Sep 11, 2013
3,405
1,004
136
The SiS530 really was the starting point for integrating as much as possible on the chipset. For cost cutting, but nobody minded that at the time since it enabled much cheaper systems. I remember it well, but it wasn't really geared for anything to do with 3D. The original nForce on the other hand. Now there was a good chipset.
The SiS530 (Socket 7) and SiS620 (Slot 1) had a SiS6306 integrated, which I'm pretty sure is just the integrated version of the 6326, with DX5 support. It was still able to handle some games, and had the MPEG-2 decoder. Some boards even had 8MB of dedicated memory for the iGPU, so the iGPU supported both shared and dedicated RAM.

They ended up replacing it with the SiS 540 on Socket 7, which had the new SiS300 (DX6), and the SiS 630 on Slot 1 with the SiS305. I saw a lot of SiS 530s; they were very popular for office PCs. Looking on the internet now, the first IGP was the SiS 5596 on Socket 5 and 7 in 1996; I never saw one.

The first ever Intel chipset with an iGPU, the 810, integrated the Intel 752 dGPU. I never saw comparisons, but the 810 should be similar to the SiS630/SiS305. It had DX6.0 support, and fake DX8.0 support (missing hardware features implemented in software). It wasn't THAT bad; even with software features, it arrived two years before the SiS 315 and the nForce with the GeForce 2 IGP.

The real problem was that it took until the GMA4000 to do a proper, fully hardware-featured IGP.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,279
822
126
It wasn't THAT bad; even with software features, it arrived two years before the SiS 315 and the nForce with the GeForce 2 IGP.
When something can't even do the basic Windows desktop correctly, that's bad in my book. Especially when the CPU was loaded, Extreme Graphics (both the 1 and 2 varieties, although 2 was way better behaved) frequently glitched out: windows not drawn, not updated, weird graphics glitches, locking up without updating and so on. That's before starting on the ultra crappy RAMDACs OEMs insisted on using with it, which is pretty important when you only have VGA output.

The first really successful Intel IGP was the GMA900, IMO. It worked correctly, and even had enough performance to run most games of the era. Paired with a Pentium M, it was a potent combination for laptops. You could even use it with a Pentium M on the desktop with an appropriate board like the AOpen i915GMm-HFS. My old proto-HTPC was powered by it, back before HD was a thing.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,643
2,525
136
By all accounts, they've been doing a pretty lousy job for 20+ years.
It's true, but having a dGPU will really ramp things up.

Because of the limited performance, even Xe doesn't get full focus. But once you start selling it as an add-on card with nearly an order of magnitude better performance, well, people will care.

But with Xe they are ready now. At no point in Intel's GPU history were they in any shape or form ready to get proper dGPUs out.
 

DrMrLordX

Lifer
Apr 27, 2000
18,172
7,076
136
But with Xe they are ready now. At no point in Intel's GPU history were they in any shape or form ready to get proper dGPUs out.
Let's hope so. Though to be honest, I would not be too upset if DG2's drivers suck for GPGPU. Not that I'm in the market for one, mind you. But I think it would be okay for the market as a whole if that happened.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,643
2,525
136
Let's hope so. Though to be honest, I would not be too upset if DG2's drivers suck for GPGPU. Not that I'm in the market for one, mind you. But I think it would be okay for the market as a whole if that happened.
They will get there.

It might also be that there's no Intel GPU worth mining on, so the software base doesn't care. Why would you want to mine at 2MH/s for 50W? Who wants that? You can do mining on an AMD iGPU, since it's based off the dGPU.

It may not be a case of Intel's GPGPU being bad, so much as the crypto software simply not existing.

We're in Phase 0 of Ethereum 2.0. We need Phase 1.5 then PoW will merge with PoS. Let's say Phase 1 is early next year, so Phase 1.5 could be mid to end of next year.

So let's say October for the first Intel dGPU, and Jan-Feb for real availability. And maybe good mining software will be ready by March?

How much will you really benefit from mining on an Intel GPU if the timeframe is that short?
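(For scale, the math behind "who wants 2MH/s for 50W" is a one-liner. All the rates below are hypothetical placeholders, not real market data:)

```python
# Back-of-the-envelope mining profitability. Inputs are hypothetical:
# reward_per_mhs_day is whatever the pool pays per MH/s per day,
# power_cost_kwh is your electricity rate.
def daily_profit(hashrate_mhs, watts, reward_per_mhs_day, power_cost_kwh):
    """Net profit per day, in the same currency as the inputs."""
    revenue = hashrate_mhs * reward_per_mhs_day
    power_cost = (watts / 1000) * 24 * power_cost_kwh
    return revenue - power_cost

# e.g. 2 MH/s at 50 W, with a $0.06/MH/day payout and $0.12/kWh power:
print(round(daily_profit(2, 50, 0.06, 0.12), 3))  # -0.024, i.e. a net loss
```

At those placeholder rates the card loses money every day it runs, which is the point: nobody writes mining kernels for hardware like that.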
 

DrMrLordX

Lifer
Apr 27, 2000
18,172
7,076
136
It may not be a case of Intel's GPGPU being bad, so much as the crypto software simply not existing.
At least with their iGPUs, it's the OpenCL drivers. Mining software relies on CUDA for NV cards and OpenCL for everything else. Intel's existing OpenCL drivers are just awful; even if the hardware is capable, you wouldn't know it.

How much will you really benefit from mining on an Intel GPU if the timeframe is that short?
There are more GPU-mineable algos out there than just Ethereum. Also I'm pretty sure Ethereum Classic (guffaw) will stay on PoW. But I could be wrong.
 

SaltyNuts

Platinum Member
May 1, 2001
2,027
167
106
Do you guys remember when Intel came out with graphics cards, or at least graphics chips, and was trying to take on NVIDIA (and maybe 3dfx) in the 90s? What was their chip called? It got its ass handed to it mightily lol. Hope this time around they do a good bit better!!!
 

DrMrLordX

Lifer
Apr 27, 2000
18,172
7,076
136
Do you guys remember when Intel came out with graphics cards, or at least graphics chips, and was trying to take on NVIDIA (and maybe 3dfx) in the 90s? What was their chip called? It got its ass handed to it mightily lol. Hope this time around they do a good bit better!!!

The i740: one of the first AGP video cards on the market. I think the Riva 128 was earlier though.
 

SaltyNuts

Platinum Member
May 1, 2001
2,027
167
106

One of the first AGP video cards on the market. I think the Riva 128 was earlier though.


Ah, yes, the i740! Damn, I remember it was bad, but the wikipedia link reminded me of just how bad. I remember the whole AGP hype - who needs memory when you have the fat AGP pipeline!!!! LOL, fun times!!! Thanks DrMrLordX!!!
 

psolord

Golden Member
Sep 16, 2009
1,362
460
136
Do you guys remember when Intel came out with graphics cards, or at least graphics chips, and were trying to take on NVIDIA (and maybe 3DFX) in the 90s? What was their chip called? It got its ass handed to it mightily lol. Hope this time around they do a good bit better!!!
What are you talking about? It supported 3D shockwave effects! xD

Also they didn't have Raja Conjuri back then. I'm pretty sure he will conjure something great of evergreen proportions once again! :)
 

Shivansps

Diamond Member
Sep 11, 2013
3,405
1,004
136
Ah, yes, the i740! Damn, I remember it was bad, but the wikipedia link reminded me of just how bad. I remember the whole AGP hype - who needs memory when you have the fat AGP pipeline!!!! LOL, fun times!!! Thanks DrMrLordX!!!
Yeah, the whole concept of AGP using main memory was bad, BUT AGP allowed graphics chips to be placed outside of the 133MB/s shared bandwidth of the PCI bus, and that alone was a huge win.

That failed concept continued with PCI-E, and now GPUs have "virtual VRAM"; it is just slow. It may be more viable with DDR5 and old GPUs. Imagine old DDR3/DDR5 GPUs, or the GT 1030 DDR4, performing better with a DDR5 motherboard. I would be very interested to see how a 128-bit GDDR5 GPU like the GTX 750 Ti/GTX 950/GTX 960 works with a DDR5 motherboard; in theory you could have a lot more bandwidth, since you can access RAM and still use the VRAM at the same time, and no longer be limited to 1/2GB of VRAM. But I don't think the driver is smart enough for that.
-nevermind, I just remembered it has to go through PCI-E.

What was a major fail was the 3.3V and 1.5V AGP incompatibility thing; you could have AGP cards that were incompatible with the AGP port on some boards...
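(Putting the bandwidth argument in numbers, using the peak theoretical figures from the bus specs, nothing system-specific:)

```python
# Peak theoretical bandwidth in MB/s. Shared PCI was one pool for every
# device on the bus; AGP (and later PCIe) gave graphics a dedicated link.
buses = {
    "PCI 32-bit/33MHz (shared)": 133,
    "AGP 1x": 266,
    "AGP 2x": 533,
    "AGP 4x": 1066,
    "AGP 8x": 2133,
    "PCIe 3.0 x16": 15754,
}
for name, mb_s in buses.items():
    print(f"{name:>26}: {mb_s:>6} MB/s ({mb_s / 133:.1f}x shared PCI)")
```

Even AGP 1x doubled what the whole PCI bus offered, and it wasn't shared with sound cards and NICs, which is why moving graphics off PCI was a win regardless of the texture-from-main-memory idea.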
 
