News Intel GPUs - Intel launches A580

Page 154 - AnandTech Forums

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136

150W DG2 is coming. Something that will sit between the A700 and A300 series.

SOC3, with 256 EUs and a 192-bit bus? That would fit the 6 GB of VRAM.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136

150W DG2 is coming. Something that will sit between the A700 and A300 series.

SOC3, with 256 EUs and a 192-bit bus? That would fit the 6 GB of VRAM.

The roadmap and the intro date for Alchemist suggest:
-Summer for SOC3. Maybe June/July
-Sep/October for Alchemist+ low end
-Very late 2023/very early 2024 for Alchemist+ higher end
-July/August 2024 for Battlemage

Never use the leftmost edge of a roadmap; that only matters for the manufacturer and system builders. For consumers it's some time after.
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
The roadmap and the intro date for Alchemist suggest:
-Summer for SOC3. Maybe June/July
-Sep/October for Alchemist+ low end
-Very late 2023/very early 2024 for Alchemist+ higher end
-July/August 2024 for Battlemage

Never use the leftmost edge of a roadmap; that only matters for the manufacturer and system builders. For consumers it's some time after.
I have not once said when it is coming, only that it is coming :)
 
  • Like
Reactions: IntelUser2000

adamge

Member
Aug 15, 2022
48
126
66
In his RTX 4070 Ti video yesterday, MLID made an interesting statement that his predictions on Intel's GPU future have come true. I'm inclined to interpret this as him twisting what he said in the past to make it seem like it aligns with the latest developments.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,719
7,016
136
In his RTX 4070 Ti video yesterday, MLID made an interesting statement that his predictions on Intel's GPU future have come true. I'm inclined to interpret this as him twisting what he said in the past to make it seem like it aligns with the latest developments.

- What was his prediction?
 

QueBert

Lifer
Jan 6, 2002
22,378
708
126
DAAMIT, I watched a few minutes of that video and didn't pay attention to the date it was uploaded. I thought it was breaking news and my dreams of buying an ARC card had possibly been smashed. I have no idea who that guy is, so I Googled whether ARC had been canceled. Nothing new; then I saw the date of the video.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
DAAMIT, I watched a few minutes of that video and didn't pay attention to the date it was uploaded. I thought it was breaking news and my dreams of buying an ARC card had possibly been smashed. I have no idea who that guy is, so I Googled whether ARC had been canceled. Nothing new; then I saw the date of the video.

Don't listen to him. Also RGT. Actually, sometimes they put up legit Intel presentations, but look at those and ignore what they say.

When it comes to MLID and ARC, it seems like he's been disproven, so he doubles down on the error. It's almost like he believes that if enough people believe it, they'll stop buying, which will cause Intel to cancel it: kind of a twisted self-fulfilling prophecy.

No matter what he says the future is in Intel's hands. Gelsinger said personally that GPU was one thing he wasn't able to do, so he has a reason to continue.

Another thing is that if they continue to improve and Battlemage is successful, then they would continue. Sometimes companies are surprised by the success of a product they are selling; even when they weren't big on the product, they push development because they see the money.

They already have volume share equal to AMD's in mobile and 40% of AMD's dGPU market share, and that's despite the troubles they went through, so it's not the worst position for them.
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,686
136
MLID made an interesting statement that his predictions on Intel's GPU future have come true.
His prediction was never clear to begin with; it was a mashup of possible scenarios. According to him, "ARC is over" and "Battlemage is only low end + midrange" are equal predictions, which could not be further from the truth.

I said it before regarding the ARC leak: it's one thing to leak tech specs or upcoming product names, another beast entirely to announce high-level exec decisions that may have an immediate impact on stock price and shareholder interests. Someone telling us in advance about an ARC cancellation would be risking their job and maybe even a lawsuit, so the chances of one individual doing this for the lulz are slim to none. Yet MLID claimed to have 4 such people on his comm channels.

 

mikk

Diamond Member
May 15, 2012
4,112
2,108
136
Many of the Intel GPU predictions from MLID were wrong. Remember the A780? Several Intel employees said such a SKU was never worked on or planned. Now he hypes AMD like crazy and Intel is the bad one for him. MLID can never accept he was wrong about something. I think RGT is more reliable than him when it comes to Intel GPUs. The Battlemage roadmap looks legit, although a little dated.
 
  • Like
Reactions: Leeea and Ranulf

CakeMonster

Golden Member
Nov 22, 2012
1,384
482
136
Blocktubed ages ago, so I can't get MLID rolled. Just a blissful black screen here. :D
Hah, same, I was wondering for a second until I saw your comment. Sometimes you have to address bad actors because they make enough noise, but in this case blocking and ignoring should be enough: shut down any mention fast and move on to factual discussion of the topic.

Even in the worst of cases I expect Intel to iterate on their architecture, because they need to maintain the iGP and established techs like Quick Sync, but given everything that's happening in AI, as well as the very fluid situation with regard to the hardware landscape, it would be suicidal not to be ready to pounce with architectures for productivity, gaming, and professional workloads. If any major breakthroughs or new usage patterns emerge, or the Taiwan situation goes down badly, Intel could stand to benefit immensely with their own fabs and GPU architectures.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
Hah, same, I was wondering for a second until I saw your comment. Sometimes you have to address bad actors because they make enough noise, but in this case blocking and ignoring should be enough: shut down any mention fast and move on to factual discussion of the topic.

Even in the worst of cases I expect Intel to iterate on their architecture, because they need to maintain the iGP and established techs like Quick Sync, but given everything that's happening in AI, as well as the very fluid situation with regard to the hardware landscape, it would be suicidal not to be ready to pounce with architectures for productivity, gaming, and professional workloads. If any major breakthroughs or new usage patterns emerge, or the Taiwan situation goes down badly, Intel could stand to benefit immensely with their own fabs and GPU architectures.

I wish there was more info out there on Quick Sync. I thought it was H.264 only. Now I see it supports AV1? I'm guessing they skipped HEVC because of royalties?
 

mikk

Diamond Member
May 15, 2012
4,112
2,108
136
I wish there was more info out there on Quick Sync. I thought it was H.264 only. Now I see it supports AV1? I'm guessing they skipped HEVC because of royalties?


Nobody skipped HEVC; Intel supports H.264, VP9, H.265 (HEVC), and AV1. What info do you want?
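For what it's worth, all of those codecs are exposed through ffmpeg's Quick Sync (QSV) encoders, which are the real encoder names h264_qsv, hevc_qsv, vp9_qsv, and av1_qsv. A minimal sketch of building such an invocation (file names and the quality value are hypothetical; actually running it needs an Intel iGPU or Arc card, and AV1 encode needs Arc-class hardware):

```python
# Sketch: build an ffmpeg argv list using Intel Quick Sync (QSV) encoders.
# Encoder names are real ffmpeg encoders; file names are hypothetical.
QSV_ENCODERS = {
    "h264": "h264_qsv",
    "hevc": "hevc_qsv",
    "vp9": "vp9_qsv",
    "av1": "av1_qsv",  # AV1 encode needs Arc / Xe-HPG class hardware
}

def qsv_cmd(src: str, dst: str, codec: str, quality: int = 23) -> list:
    """Return an ffmpeg argv list for a hardware QSV encode."""
    return [
        "ffmpeg", "-hwaccel", "qsv",
        "-i", src,
        "-c:v", QSV_ENCODERS[codec],
        "-global_quality", str(quality),  # ICQ-style quality target
        dst,
    ]

print(qsv_cmd("in.mp4", "out.mkv", "av1"))
```

Pass the list to subprocess.run() to actually encode; swapping the codec key is all it takes to compare the four encoders.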
 
  • Like
Reactions: Thunder 57

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
Nobody skipped HEVC; Intel supports H.264, VP9, H.265 (HEVC), and AV1. What info do you want?

Nice. I just wish CPU reviews would include it like Anand did in the Sandy Bridge review. All the more reason not to save a few bucks on an F SKU.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
VR on ARC A770 16GB first look with HP Reverb G2 headset: https://babeltechreviews.com/first-look-at-arc-vr-performance/

TLDR: The ARC A770 wins four out of six benchmarks, but frame delivery sucks big time due to unevenness. DO NOT use it for VR at this time.
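That kind of unevenness is why average FPS alone is misleading; reviewers usually quantify frame pacing with percentile frametimes ("1% lows"). A minimal sketch, with a made-up frametime trace for illustration:

```python
# Sketch: quantify frame pacing from a list of frametimes in milliseconds.
# Two GPUs with the same average FPS can feel very different if one has
# frametime spikes; the 99th-percentile frametime ("1% low") exposes that.
def pacing_stats(frametimes_ms):
    n = len(frametimes_ms)
    avg = sum(frametimes_ms) / n
    p99 = sorted(frametimes_ms)[int(n * 0.99)]  # ~99th percentile frametime
    return {
        "avg_fps": 1000.0 / avg,
        "p99_frametime_ms": p99,
        "p99_fps": 1000.0 / p99,  # "1% low" style figure
    }

# Hypothetical trace: steady 11 ms frames with occasional 40 ms hitches.
trace = [11.0] * 95 + [40.0] * 5
print(pacing_stats(trace))
```

On that toy trace the average FPS still looks healthy while the 1% low collapses to 25 FPS, which is exactly the "wins benchmarks but frame delivery sucks" pattern.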

Also, Babel Tech Reviews has been acquired by Jon Peddie Research (note at bottom of article)

They said that proper VR support is in the works but some time away. One said 3-6 months back in November or something.

They need to get the DX11 fix Raja alluded to first. They had to write native DX9 code (not necessarily DXVK), so it sounds like a lot of work. It's not just some popular targeted games; really old benchmarks like 3DMark03 benefit from it.

DX9 performance still has a lot of room to grow though. Their own benchmarks show no loss in performance at 1440p for LoL and Stellaris. That means a CPU bottleneck at 1080p, and in this case it's due to too many draw calls.
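The draw-call point can be put in rough numbers: if the driver burns a fixed amount of CPU time per draw call, the CPU-side FPS cap doesn't change with resolution, which is why 1080p and 1440p land on the same figure. A toy model (both numbers below are made-up illustration values, not measured Arc overheads):

```python
# Toy model: CPU-bound FPS cap from per-draw-call driver overhead.
# The cap is independent of resolution, matching the LoL/Stellaris
# observation that 1440p shows no loss versus 1080p.
def cpu_fps_cap(draw_calls, us_per_call):
    frame_cpu_ms = draw_calls * us_per_call / 1000.0  # CPU time per frame
    return 1000.0 / frame_cpu_ms

# Hypothetical: 2000 draw calls at 10 us of driver overhead each.
print(cpu_fps_cap(2000, 10.0))  # same cap whether rendering 1080p or 1440p
```

Halving the per-call overhead in the driver doubles the cap, which is why DX9/DX11 driver work moves the needle so much on low-end CPUs.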
 
Jul 27, 2020
15,756
9,821
106

Acer BiFrost card discounted to $349.99

From one user review:

The A770 BiFrost is nice when it is working without issues. It isn't double-stick-taped together like its counterpart, and has better cooling and performance. On Win 11, it easily manages 60+ FPS at 4K, and rockets Firestrike upwards of 200 FPS with a score of 26000+, and if your game won't run on it, odds are simply adding the correct DXVK Vulkan DLL in the game folder will have it playing smoothly without issues. But for some HELLISH reason, it functions better in my Ryzen rig, gaining over 3K total score in Firestrike, as well as better perf in other tests.

The irony!
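The "DXVK in the game folder" trick the review mentions is just dropping DXVK's replacement DLLs next to the game's executable, so the game's D3D calls get translated to Vulkan. A hedged sketch of that copy step (paths and the exact DLL list are hypothetical and vary per game; DXVK releases ship d3d9.dll, d3d10core.dll, d3d11.dll, and dxgi.dll in x32/x64 flavors):

```python
# Sketch: "install" DXVK for one game by copying its replacement DLLs
# next to the game's .exe. Use the x64 or x32 DLLs from a DXVK release
# matching the game's architecture; this subset is just an example.
import shutil
from pathlib import Path

DXVK_DLLS = ["d3d9.dll", "d3d11.dll", "dxgi.dll"]

def deploy_dxvk(dxvk_dir, game_dir, dlls=DXVK_DLLS):
    """Copy DXVK DLLs into the game folder; returns the created paths."""
    copied = []
    for name in dlls:
        dst = Path(game_dir) / name
        shutil.copy2(Path(dxvk_dir) / name, dst)
        copied.append(dst)
    return copied
```

Deleting those DLLs from the game folder reverts the game to the native D3D runtime, so the tweak is easy to undo per game.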
 
  • Like
Reactions: DAPUNISHER

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
By the way, some users with lower-end CPUs like Ryzen 4xxx APUs report noticeably lower performance in games like Cyberpunk, despite the game being said to be well optimized.

In Xe2, they are changing the Xe core config from a dual-subslice layout (8x 256-bit per subslice) to a single 8x 512-bit arrangement.
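A quick back-of-the-envelope check on that reorganization, using only the numbers from the post: the total vector width per Xe core is unchanged, the lanes are just regrouped into half as many, twice-as-wide units.

```python
# Sanity check: total vector width per Xe core before and after the change.
xe1_bits = 2 * 8 * 256  # dual subslice, 8 units of 256-bit each
xe2_bits = 1 * 8 * 512  # single grouping, 8 units of 512-bit each
print(xe1_bits, xe2_bits)  # both come out to 4096 bits
assert xe1_bits == xe2_bits
```

So raw ALU width isn't the point of the change; any gains would come from scheduling and register-file behavior, as speculated below.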

Since there's a Thread Control unit and register file per Xe core, I am going to speculate the change will reduce register pressure and improve the Thread Control unit's ability to hand off work to the EUs.

The problem in games like Cyberpunk on lower-end CPUs likely has to do with high driver overhead. The Xe2 changes might help further in that regard, but it shows that even the DX12 drivers have lots of room to improve. Not so much for 13600K+ users, mind you, but low-end systems.

Acer BiFrost card discounted to $349.99

Lots of people like the look of the Intel reference cards, but the double-sided tape thing is so stupid.

Can't believe that rather than reducing the number of screws needed, they decided to get rid of most of them, like it's somehow a benefit. Could have made it magnetic or something. It isn't Premium, it's "Premium".

I know most of the reference models have way too many screws while third-party models get away with having fewer than 10. They could have done the same thing.
 
Last edited:

Aapje

Golden Member
Mar 21, 2022
1,313
1,773
106
If double-sided tape is strong enough, you don't need a lot of screws to replace it.
 
Jul 27, 2020
15,756
9,821
106
If double-sided tape is strong enough, you don't need a lot of screws to replace it.
But how well does it handle the heat of extended gaming sessions over a long period of time?

And it's a big annoyance when it comes time to repaste these cards.