News Intel GPUs - Intel launches A580


DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Intel is the only graphics company that cares about OpenCL these days. It's not about the driver; the mining software devs don't care about iGPUs.

Apparently, the mining software that would work in Intel iGPUs broke in a driver update either last year or earlier this year.
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
Doesn't Intel have their oneAPI thing as a CUDA competitor? I'd assume developers would use that if they end up making a mining client for Arc GPUs.
If it pans out, this is more of a high-end engineering & development workstation/HPC server bit. It won't affect us plebs.
 

Tup3x

Senior member
Dec 31, 2016
944
925
136
Well, I guess it will depend on whether or not Arc GPUs are meant for pure gaming only.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
It looks like Intel is committing to a 2022 launch:
Intel’s Raja Koduri confirms that the company is partnering with the likes of Asus, Gigabyte, MSI, and other OEMs for the 2022 launch of its upcoming Alchemist GPUs.

The unreliable rumor blog wccftech is claiming Q1 next year:

That feels fluffy to me. I expect Intel will have GPUs being sent to board makers in Q1, but I would be very surprised if we see any products before Q2. I also question whether we will see any real quantity of these; it is, after all, yet another GPU being made at TSMC.
 
Last edited:
  • Like
Reactions: Tlh97 and mikk

mikk

Diamond Member
May 15, 2012
4,111
2,105
136
I mean, Intel confirmed the launch to be in Q1, although the exact timeline is still unknown. If they launch at CES in early January, I fully expect the first cards in stores the same quarter.
 
  • Like
Reactions: Tlh97 and Leeea

Glo.

Diamond Member
Apr 25, 2015
5,657
4,409
136
Intel GPUs should be in high-volume manufacturing already. There is no reason why they would not launch in Q1, albeit late Q1.
 

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
Awesome! Hope Intel maintains this trend of breaking mining software :D. This will give them a leg up in building a reputation in the gaming market.
Maybe, if they only break mining software without breaking any games or even other OpenCL programs.

Stopping mining by accidentally breaking drivers and then not fixing them is a bit like security by obscurity, though: all it takes is enough people (like miners) showing enough interest to fix the problem.
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
Maybe, if they only break mining software without breaking any games or even other OpenCL programs.

Stopping mining by accidentally breaking drivers and then not fixing them is a bit like security by obscurity, though: all it takes is enough people (like miners) showing enough interest to fix the problem.
Well, as I've noted elsewhere, there needs to be a hardware tie-in that causes modified firmware to brick the card. No fun for enthusiasts who like to play with firmware (GPU BIOS), but a trade that's worth it IMHO.
In that case, the algorithm would need to be changed - and that's a much bigger deal.
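A minimal sketch of that hardware tie-in, assuming a per-die key fused into the silicon (all names here are hypothetical, and real GPUs would use asymmetric signatures verified by a boot ROM rather than a symmetric MAC in software):

```python
import hashlib
import hmac

FUSED_KEY = b"per-die-secret-burned-at-fab"  # stand-in for an eFuse key

def sign_firmware(image: bytes, key: bytes = FUSED_KEY) -> bytes:
    # Vendor signs the official firmware image at build time.
    return hmac.new(key, image, hashlib.sha256).digest()

def boot(image: bytes, signature: bytes) -> str:
    # Boot ROM recomputes the MAC with the fused key; any modified image fails.
    if not hmac.compare_digest(sign_firmware(image), signature):
        return "refuse"  # in the proposal above, the card simply won't run
    return "boot"
```

The point of tying verification to a key in hardware is that flashing a patched BIOS can't bypass the check, so changing behavior means attacking the algorithm itself.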
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
The only thing the GPU needs to do is check the bandwidth available... it is stupid to block one mining algo, it is stupid to do a "50%" nerf, it is stupid to require a "display connected". Nvidia had the right idea with the PCIe link check on the 3060, but they never really intended to block their GPUs from mining.

If the PCIe link is less than x8, GPU compute does not work; that's it. That alone reduces the number of GPUs that can be used by miners without completely killing your mining market or screwing a gamer who wants to make an extra buck with the card they paid for.

Of course, eventually some expensive risers will show up with an x1-to-x8 bridge on them... and this can still be handled by checking the actual link speed at boot-up.
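For illustration, the gate described above could be prototyped in user space on Linux like this (the sysfs attribute is real; the x8 policy, device path, and function names are just this sketch's assumptions, and a real gate would live in the driver or firmware):

```python
from pathlib import Path

MIN_LINK_WIDTH = 8  # hypothetical policy: no compute below a x8 link

def negotiated_link_width(bdf: str) -> int:
    # Linux exposes the negotiated PCIe width in sysfs, e.g. "16" for x16.
    # bdf is the device address, e.g. "0000:01:00.0".
    path = Path(f"/sys/bus/pci/devices/{bdf}/current_link_width")
    return int(path.read_text().strip())

def compute_allowed(width: int) -> bool:
    # A card on a typical x1 mining riser negotiates width 1 and fails here.
    return width >= MIN_LINK_WIDTH
```

Since the check reads the *negotiated* width, a card dropped onto an x1 riser fails it even if the card itself is x16-capable.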
 
Last edited:
  • Like
Reactions: psolord

Glo.

Diamond Member
Apr 25, 2015
5,657
4,409
136


So the 128 EU DG2 design is: 1024 ALUs, 32 ROPs, 96 bit GDDR6 bus.
 

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
I mean Intel confirmed the launch to be in Q1, although the exact timeline is still unknown. If they launch at CES early January I fully expect the first cards in store the same quarter.

I don't know about you, but when I read "Q1 launch" I picture a possibly working single unit of product, on March 31st at 11:58 pm, being launched via rocket. The rocket may then explode on impact, destroying the product and anyone or anything in the area, but since it was still technically launched, a marketing droid will croon on about the company's successful launch the next day.
 
  • Haha
Reactions: Tlh97 and blckgrffn
Feb 4, 2009
34,494
15,729
136


So the 128 EU DG2 design is: 1024 ALUs, 32 ROPs, 96 bit GDDR6 bus.

While I know this is an entry-level part, could someone dumb this down: is this good news or bad news?
Does the presence of 6GB on an entry-level card mean the performance-oriented card will carry 12GB or maybe more?
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
So much for "Intel DGx will make great mining cards". With that limited bus width, I'm not seeing it, unless these cards come in at a $150 price point.
Good. I hope Intel keeps disappointing miners in one way or another. It would be nice to have ONE vendor catering to just the gaming market. That said, even if Intel succeeds, I don't see them being competitive in the mid-to-high end for another 5 years. It takes time to get good at developing drivers that support a wide range of games at the performance levels gamers expect.
 
  • Like
Reactions: ryan20fun

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
So much for "Intel DGx will make great mining cards". With that limited bus width, I'm not seeing it, unless these cards come in at a $150 price point.

128 EU / 1024 SP / 32 ROP @ 2+ GHz @ 65W TDP w/ 6GB of memory actually sounds like a fairly decent entry-level card at $150. Even with the somewhat limited memory bus, it should be in GTX 1650 Super territory.

Buuut, that price is unlikely. To say the least.
 
  • Like
Reactions: beginner99

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
It takes time to get good at developing drivers that support a wide range of games at performance levels gamers expect.

Yeah. That's the real battle Intel will face. I have no doubt they can make a usable GPU for playing modern games. But will they have all the needed fixes for 5-year-old games? 10-year-old games? I still play Civ 5, and from a financial perspective it makes little sense for Intel to invest in supporting older games. It's not about ultimate performance in such old games, but that they don't have glitches or outright crash. As far as I understood from a post here a couple of years ago, every single game needs some custom driver fixes to make it work.
 

Glo.

Diamond Member
Apr 25, 2015
5,657
4,409
136

Rough count for 128 EU DG2:
  • Pixel fillrate: 70.4 GP/s at 2.2 GHz
  • Texture fillrate: 153.6 GT/s at 2.2 GHz
  • Memory bandwidth: 192 GB/s
Those are numbers similar to: GTX 1650 Super/1660/1660 Super/1660 Ti, RX 5500 XT.
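Two of those numbers can be reproduced with back-of-the-envelope arithmetic (the 2.2 GHz clock and 16 Gbps/pin GDDR6 speed are assumptions; the texture fillrate depends on the TMU count, which isn't confirmed):

```python
# Sanity-checking the rough count above. ROP count and bus width are from
# the leak under discussion; clock and memory speed are assumed.
ROPS = 32
CLOCK_GHZ = 2.2
BUS_WIDTH_BITS = 96
GDDR6_GBPS_PER_PIN = 16

# Pixel fillrate = ROPs x clock: 32 x 2.2 = 70.4 GP/s
pixel_fillrate_gpixels = ROPS * CLOCK_GHZ

# Bandwidth = (bus width in bytes) x per-pin data rate: 12 x 16 = 192 GB/s
bandwidth_gbytes = BUS_WIDTH_BITS / 8 * GDDR6_GBPS_PER_PIN

print(pixel_fillrate_gpixels, bandwidth_gbytes)
```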

This GPU should be at least on the same level as the GTX 1650 Super, but will not be limited by its VRAM capacity. IMO, it may touch the 1660 non-Super in most games.

Also, keep in mind this is important for the throughput of the GPU: it can register 2 triangles per clock, while, IIRC, the GTX 1650 Ti does one.

It's a kinda weird design, overbuilt in one area while hindered in another.

I BLOODY LOVE IT.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
...and from a financial perspective it makes little sense for Intel to invest in supporting older games. It's not about ultimate performance in such old games, but that they don't have glitches or outright crash. As far as I understood from a post here a couple of years ago, every single game needs some custom driver fixes to make it work.

I think it's the other way round. If they can get older games to work, Intel has a real winner on their hands. These will sell like hot cakes.

As an anecdote, I've got plenty of 20+ year old games working just fine on my laptop's IGP. Of course there will be edge cases, but I don't think they'll have to do a lot of donkey work.
 
Feb 4, 2009
34,494
15,729
136



This GPU should be at least on the same level as the GTX 1650 Super, but will not be limited by its VRAM capacity. IMO, it may touch the 1660 non-Super in most games.

Also, keep in mind this is important for the throughput of the GPU: it can register 2 triangles per clock, while, IIRC, the GTX 1650 Ti does one.

It's a kinda weird design, overbuilt in one area while hindered in another.

I BLOODY LOVE IT.

This is kind of what I meant earlier. Is this good news?
Does 6GB on a low-end card mean 12GB on a high-end card?
Will the two triangles per clock scale with a more powerful card, or will the underbuilt area maybe be overbuilt on the halo product?
Personally, I am confused and excited by how little Intel has released.
 
Feb 4, 2009
34,494
15,729
136
I think it's the other way round. If they can get older games to work, Intel has a real winner on their hands. These will sell like hot cakes.

As an anecdote, I've got plenty of 20+ year old games working just fine on my laptop's IGP. Of course there will be edge cases, but I don't think they'll have to do a lot of donkey work.

But gamers like to complain, especially about new stuff.
Amazon's MMO bricking cards - oh noes! Amazon sucks! When in reality it was one or two dozen cards from one manufacturer that had solder defects.
Coil whine is another great example. Never heard of this problem until it got a cool name, "coil whine", and then suddenly everyone has or has had a card with it...
I can guarantee there will be a major scandal because the colors are off in a game made in 1992 called Mr. Pinky's Adventure.
 
  • Like
Reactions: Tlh97 and ryan20fun

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Coil whine is another great example. Never heard of this problem until it got a cool name “coil whine” then suddenly everyone has or has had a card with it...

Coil whine is very real. I didn't really believe the stories either, until I heard an example myself.

Kind of like how you can hear that an old-fashioned tube TV is on from several metres away. Or I can, at least, and I'm perfectly willing to do a blind test for what that's worth.

I can guarantee there will be a major scandal because the colors are off in a game made in 1992 called Mr. Pinky’s Adventure.

Oh, undoubtedly.

(Checks online catalogues to see if there was indeed a 1992 game named Mr. Pinky's Adventure. Because stranger things have happened.)