
News Intel GPUs - we've given up on B770, where's Celestial already

For anyone hoping Intel's drivers will duplicate AMD's fine wine effect, I remind you how Intel fixes security flaws:

Starting from Intel's graphics driver version 15.40.44.5107, applications that run exclusively on DirectX 12 API no longer work with GPUs integrated into Intel's 4th Generation Core processors as well as Celeron and Pentium chips powered by the Haswell architecture.
 
Ruby is AMD's doll. Intel will have to make a new one.

And she shall be named after some blue gemstone that isn't already copyrighted.

I'm not sure AMD ever used Ruby themselves; Ruby was from the ATI era. AIBs kept Ruby on GPU packaging well into the 2010s, though. My Sapphire 7950 had Ruby on the box.
 
GPU computing used for AI generation of cute digital girls. Intel might have to resort to that to get all the lonely fat teens interested.

Mom: "Matt!!!! What are you doing???? Dinner's getting cold!"

Matt: "Coming Mom!"

(Matt clicks the button to generate one more girl)
 
It seems exactly like the Xe IGPs: good benchmark scores, subpar in games.

I suspect they either overfit their tests or the driver replaces shaders for benchmarks. Stranger things have happened.
 
Despite only being launched in China, the Arc A380 seems plenty successful at creating hot air.
What's the prevailing wind direction?

Could delivery via hot air balloons be a viable strategy?

Bonus if it's slow as that gives the driver team more time.

I came into this thread expecting news now that there has been a release of sorts, but it was all hijacked by anime talk, so maybe NTMBK should rename the thread again!
 
Intel's NUCs aren't affordable. But at least they will be supported with drivers for a couple of years, if not more. There is hope for a modicum of success, even though it's probably microscopic from the looks of it.
 
For anyone hoping Intel's drivers will duplicate AMD's fine wine effect, I remind you how Intel fixes security flaws:

Just saying, but didn't AMD tell people to just shut down half of the CPU they paid a lot of money for (use Game Mode) because it was performing badly in games?!
If that had been the FX platform, do you think AMD would have given even half a duck about fixing it?!
But I guess Intel doesn't understand economies of scale. It wants to make a quick buck.
No matter how much they scale this up, it will always be more expensive to make than the bigger versions, and people are always cheap; history is full of much better but more expensive tech that is now forgotten.
 
Because of how GPU-Z has to measure power, it requires an update before it can properly read power usage whenever a new CPU/GPU comes out. It basically has to extrapolate its values, because not every chip reports current draw the same way. Intel's own reporting should be accurate, provided they aren't fibbing.
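To illustrate the point above, here's a minimal sketch (not GPU-Z's actual code, and the telemetry field names are invented for the example) of why a monitoring tool needs per-chip updates: a chip that reports voltage and current directly allows an exact P = V × I, while one that doesn't forces a rough extrapolation from a known TDP.

```python
def estimate_power(telemetry: dict) -> float:
    """Estimate board power (watts) from whatever a chip reports.

    Hypothetical telemetry keys for illustration only.
    """
    if "voltage_v" in telemetry and "current_a" in telemetry:
        # Direct measurement: P = V * I
        return telemetry["voltage_v"] * telemetry["current_a"]
    # Fallback: scale the rated TDP by reported utilization -- a crude
    # extrapolation that needs per-chip calibration, hence the updates.
    return telemetry["tdp_w"] * telemetry.get("utilization", 0.0)
```

A new GPU that reports neither field the same way would fall through to the fallback and read wrong until the tool is taught its reporting scheme.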


Yes indeed. There are two more tests: Expreview measured consumption directly at the PCIe slot and ran 3DMark Time Spy. Spikes up to 94.5 W, with slightly over 80 W on average. Idle consumption is 20 W, which is too high. Another weakness.




At some of the higher game settings the RX 6400 struggles (I guess 4 GB is the culprit).

This wasn't known:

The TBP of the GPU is configurable between 75 W and 87 W, with the clock speed correspondingly configurable between 2 GHz and 2.35 GHz.


 
That design-option crap is annoying. How is the buyer supposed to differentiate between the three differently clocked versions? A naming scheme like A380L, A380, and A380H could have helped.

That's a general problem with laptops. CPUs can also be set up with different TDPs, and you often aren't told, so you don't know how well the CPU will actually perform.
 
Yes indeed. There are two more tests: Expreview measured consumption directly at the PCIe slot and ran 3DMark Time Spy. Spikes up to 94.5 W, with slightly over 80 W on average. Idle consumption is 20 W, which is too high. Another weakness.

Was that measured with a single display or multiple displays? If that's just a single display, it's crazy high compared to even higher-end GPUs at idle. Even a 3080 will idle at less than 10 W.
 