News Intel GPUs - waiting for B770


511

Diamond Member
Jul 12, 2024
4,510
4,125
106
Nvidia basically killed Intel's dGPU efforts in client permanently.
Now, as for the IP: if Intel has any sense left, they will continue to improve it.
 

DZero

Golden Member
Jun 20, 2024
1,623
629
96
Nvidia basically killed Intel's GPU efforts.
And seeing how the CPU and fab sides are doing, it seems Intel is on the verge of being left way behind.

Also, don't forget Apple. Their A19 Pro GPU is on par with the GTX 1650M, which is weird, but remember... it's a phone chip. Putting this chip in a Mac (which is in the plans, according to the rumors), where it can run even harder, would be a critical hit for Intel's iGPUs, which consume more power for only a little more performance.
And that's if it's ONLY the MacBooks. If Apple released a cheaper Mac Mini with said phone processor, it would be a BIG problem for Intel, both CPU- and GPU-wise.
 

511

Diamond Member
Jul 12, 2024
4,510
4,125
106
And seeing how the CPU and fab sides are doing, it seems Intel is on the verge of being left way behind.

Also, don't forget Apple. Their A19 Pro GPU is on par with the GTX 1650M, which is weird, but remember... it's a phone chip. Putting this chip in a Mac (which is in the plans, according to the rumors), where it can run even harder, would be a critical hit for Intel's iGPUs, which consume more power for only a little more performance.
And that's if it's ONLY the MacBooks. If Apple released a cheaper Mac Mini with said phone processor, it would be a BIG problem for Intel, both CPU- and GPU-wise.
Intel hasn't even released new IP on a new node; you are comparing Apple's latest GPU IP on N3P to 1-2 year old IP on N3B.
Let Celestial arrive before making that call.

Even the Arc 140V beats the 1650M in Time Spy.
 

DZero

Golden Member
Jun 20, 2024
1,623
629
96
Intel hasn't even released new IP on a new node; you are comparing Apple's latest GPU IP on N3P to 1-2 year old IP on N3B.
Let Celestial arrive before making that call.
I didn't even mention the Apple M5, which targets Intel and AMD; that would be a true carnage. And I compared it to an old notebook chip, but it shows how much Apple's phone chips have evolved.
 

511

Diamond Member
Jul 12, 2024
4,510
4,125
106
I didn't even mention the Apple M5, which targets Intel and AMD; that would be a true carnage. And I compared it to an old notebook chip, but it shows how much Apple's phone chips have evolved.
I am not worried about the GPU part. Xe3 is a big upgrade as a uArch; it should be good enough to hold its own against the base M5.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,516
1,357
136
It will be in the 4070-4070S range, so a $400-430 price tag is likely.
No one will buy it at that price over even a plain 4070 unless they really, really hate Nvidia or have a room in their home dedicated to all things Intel. Performance doesn't matter: they are the No. 3 player, and with that comes the expectation that it will be cheap, since they don't offer anything enticing enough to make buyers ignore Nvidia or AMD. That price would've made sense if it had 24GB of VRAM; heck, people would pay up to $550 for 32GB.
 

DavidC1

Golden Member
Dec 29, 2023
1,833
2,958
96
iGPU in their Pentiums, maybe; it's very likely that Nvidia wants the juicy part of the steak.
Even more: if Nvidia licenses their dGPU IP in lower tiers, it might be over for Intel's iGPUs too, except maybe the N-series.
There's NO difference to Intel whether they keep the IP for Pentium only, the N-series only, or all iGPUs; the fixed cost is always there. The only way what you guys are suggesting makes sense is if Intel uses older-generation architectures.
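To put rough numbers on that fixed-cost point, here's a trivial back-of-envelope sketch in Python; the dollar figure and volumes are made up purely for illustration, not Intel's actual costs:

```python
# Purely illustrative numbers, NOT Intel's actuals: the fixed IP/driver
# cost barely changes with SKU count, so shipping the IP in fewer chips
# only shrinks the volume it gets amortized over.
FIXED_COST = 500e6  # hypothetical annual GPU IP + driver spend, USD

for label, units in [("Pentium/N-series only", 10e6),
                     ("entire iGPU lineup", 100e6)]:
    print(f"{label}: ${FIXED_COST / units:,.2f} per chip")
```

Same spend either way; restricting the IP to low-volume parts just makes each chip carry ten times the cost.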

Now, I'm not saying Intel dGPUs are 100% dead. It hinges on how well the Intel GPU team executes, plus whether the Nvidia joint venture is successful. But the first has to be VERY successful, and the second would mean total death: no Intel graphics ever. So I assume 2027 is the end.

Intel graphics: 1998-2027 RIP
 

regen1

Member
Aug 28, 2025
86
141
61

Outside of Jaguar Shores, the company has an additional, unannounced GPU design with a lower power requirement for servers on its road map that could arrive next year at some point, according to a source familiar with the company’s plans.

I think he is referring to a specific GPU targeted at inference, which has been in the works for quite some time now.
 

regen1

Member
Aug 28, 2025
86
141
61

A report by CRN revealed that Intel plans to release an AI chip with low power consumption alongside Jaguar Shores, which will likely target the inference markets.
How did Wccftech conclude that Jaguar Shores is releasing next year?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,012
32,464
146
Testing with a 3700X, Spidey Remastered was fully playable without RT. Turning it on is where the wheels come off. Which is a bummer, because with a fast enough CPU it is capable of using some RT on a $250 card. Steve is stuck on the mentality that RT is not a feature at this price point; he usually tests with either all RT features on or all off. Turning on just RT reflections in the Spidey games is a significant eye-candy boost, even without window licking. Remastered is the least impacted by leaving it off, with reflections making a progressively bigger difference in visuals as the series goes on.

In the end, I would not recommend it over the 9060 XT for gaming. The only pro is the extra VRAM. The CPU overhead is exacerbated when you have to use third-party apps for recording and streaming. Steve mentioned the 5600 was 3 years old. Technically correct, but misleading: it's a downclocked 5-year-old CPU. I think part of the performance issues with the 2600 in some titles may be the card running at PCIe Gen 3 x8.
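On that last point, a quick bandwidth calculation (a minimal Python sketch of the theoretical link rates, not measured numbers) shows what an x8 card gives up on an older Gen 3 platform like the 2600's:

```python
# Back-of-envelope PCIe link bandwidth: theoretical one-way maximum
# after 128b/130b encoding overhead; real-world throughput is lower.
GT_PER_LANE = {3: 8.0, 4: 16.0}  # transfer rate per lane, GT/s
ENCODING = 128 / 130             # 128b/130b line-code efficiency

def bandwidth_gb_s(gen: int, lanes: int) -> float:
    return GT_PER_LANE[gen] * lanes * ENCODING / 8  # Gbit/s -> GB/s

print(f"Gen 3 x8 : {bandwidth_gb_s(3, 8):5.1f} GB/s")   # ~7.9
print(f"Gen 4 x8 : {bandwidth_gb_s(4, 8):5.1f} GB/s")   # ~15.8
print(f"Gen 4 x16: {bandwidth_gb_s(4, 16):5.1f} GB/s")  # ~31.5
```

So an x8 card drops to roughly half its link budget on a Gen 3 board, which lines up with the worse results on the 2600 in titles that stream assets heavily.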

EDIT: I should give props to the Arc team though. They are always hustling, improving problematic games for the A- and B-series month after month. More impressive still when taking into account the mass layoffs going on.
 
Jul 27, 2020
27,954
19,101
146
Good to know that they are serious about fixing the issues. Bodes well for future products.
I just thought of something. Maybe all Intel's driver team did was change the compiler version? That would explain why the performance increase wasn't mentioned in the changelog: even they probably weren't aware that the new compiler version improved performance on older architectures.