
News Intel GPUs - we've given up on B770, where's Celestial already

That makes a lot more sense. So roughly the same size as GA104.

Yes, package sizes are usually quoted as a mm x b mm, while die sizes are given as c mm².

Also, you can often just tell. Package sizes are generally way outside the plausible die-size range, so from the number alone you can usually figure out whether it refers to the die or the package. Of course you'd compensate for other variables, like whether it's a desktop CPU or one that'll go in a smartphone. Lakefield, for example, is only 12 mm x 12 mm for the package size, so you need to do some additional searching.

Many tech sites make the mistake of not distinguishing between the two (probably because they don't understand it that well).
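To illustrate the distinction numerically (a quick sketch; the 12 mm x 12 mm Lakefield package figure is from the post above, and ~392 mm² is a commonly cited figure for the GA104 die):

```python
# Sketch: turn either kind of quoted size into an area in mm^2.
# Package sizes are quoted as "a x b" (mm x mm); die sizes as a bare mm^2 number.
def area_mm2(spec: str) -> float:
    """Return area in mm^2 from 'a x b' (package) or a bare number (die)."""
    if "x" in spec:
        a, b = (float(part) for part in spec.split("x"))
        return a * b
    return float(spec)

print(area_mm2("12 x 12"))  # Lakefield package: 144.0 mm^2
print(area_mm2("392"))      # GA104-class die: 392.0 mm^2
```

The point is simply that a 144 mm² package can hide a much smaller die, so the two numbers are never interchangeable.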

While I believe AnandTech articles have deteriorated in quality over the years, I appreciate them not putting up speculation/rumor as part of their news feed.
 
Friends, school me if I'm wrong, but my understanding is that Intel is 'deflating their footballs'. Raja Koduri may be brilliant and he knows computer graphics very, very well, but Intel's upcoming GPUs will only work optimally with Intel CPUs and chipsets, and not nearly as well on Ryzen platforms. If this is true, it's a massive gyp. Imagine buying a Toyota Corolla only to realize you must use their special blend of gas or their specific tires for it to run well. Wouldn't that piss you off?
 
Friends, school me if I'm wrong, but my understanding is that Intel is 'deflating their footballs'. Raja Koduri may be brilliant and he knows computer graphics very, very well, but Intel's upcoming GPUs will only work optimally with Intel CPUs and chipsets, and not nearly as well on Ryzen platforms. If this is true, it's a massive gyp. Imagine buying a Toyota Corolla only to realize you must use their special blend of gas or their specific tires for it to run well. Wouldn't that piss you off?

There really hasn't been much in the way of rumors about actual gaming performance in general. Far too early to figure out if there is anything to what you suggest.
 
Friends, school me if I'm wrong, but my understanding is that Intel is 'deflating their footballs'. Raja Koduri may be brilliant and he knows computer graphics very, very well, but Intel's upcoming GPUs will only work optimally with Intel CPUs and chipsets, and not nearly as well on Ryzen platforms. If this is true, it's a massive gyp. Imagine buying a Toyota Corolla only to realize you must use their special blend of gas or their specific tires for it to run well. Wouldn't that piss you off?
That's all unlikely. Intel will not do that. What gave you that idea? DG1 laptop sales requirements?
 
Friends, school me if I'm wrong, but my understanding is that Intel is 'deflating their footballs'. Raja Koduri may be brilliant and he knows computer graphics very, very well, but Intel's upcoming GPUs will only work optimally with Intel CPUs and chipsets, and not nearly as well on Ryzen platforms. If this is true, it's a massive gyp. Imagine buying a Toyota Corolla only to realize you must use their special blend of gas or their specific tires for it to run well. Wouldn't that piss you off?
Why would it not work with AMD CPUs?

Does it have special SIMD paths that would gimp performance on non-Intel CPUs? Would the drivers gimp performance on AMD platforms?

That would only harm Intel, not AMD. Can you imagine the backlash? People would grill Intel for it.

If Intel is serious about dGPUs (which they are), they will go all out, regardless of platform.

P.S. I'm gonna tell you guys right now: the actual combo of an AMD CPU and an Intel dGPU is what I'm gonna build next year.
 
There really hasn't been much in the way of rumors about actual gaming performance in general. Far too early to figure out if there is anything to what you suggest.

I heard that MLID says the drivers are "horrible" (worse than Vega's were 6 months before launch) and performance is around RTX 3060 level.
 
Why would it not work with AMD CPUs?
Why didn't Intel's then-premier AX200 Wi-Fi 6 M.2 card work with anything besides Intel 10th- (and now 11th-)gen CPUs and their associated chipsets?

It wouldn't surprise me in the least if Intel's new dGPUs only function in an all-Intel system.
 
Why didn't Intel's then-premier AX200 Wi-Fi 6 M.2 card work with anything besides Intel 10th- (and now 11th-)gen CPUs and their associated chipsets?

It wouldn't surprise me in the least if Intel's new dGPUs only function in an all-Intel system.

Yeah, people keep saying "Intel wouldn't do that". First of all, they already did with DG1, and second of all, Intel loves doing exactly that with everything they make. It's a bad idea, but it has always been a bad idea and they keep doing it anyway. So you really can't dismiss it out of hand.
 
Why didn't Intel's then-premier AX200 Wi-Fi 6 M.2 card work with anything besides Intel 10th- (and now 11th-)gen CPUs and their associated chipsets?

Eh wot

My x570 motherboard has an AX200 built into it. It works, though the AX200 itself is buggy. It's also buggy in Intel machines.
 
Why didn't Intel's then-premier AX200 Wi-Fi 6 M.2 card work with anything besides Intel 10th- (and now 11th-)gen CPUs and their associated chipsets?

It wouldn't surprise me in the least if Intel's new dGPUs only function in an all-Intel system.

That's because those are actually AX201s, which are CNVi and only exist on Intel chipsets. You could say the AX201 gives Intel an advantage, because having some of its functions on the chipset saves board real estate and cost, but that's a different story.

The AX200 works with everything, including AMD. I use the AX200 on a Broadwell laptop. Works great.

First of all, they already did with DG1, and second of all, Intel loves doing exactly that with everything they make. It's a bad idea, but it has always been a bad idea and they keep doing it anyway. So you really can't dismiss it out of hand.

DG1 doesn't count. DG1 isn't artificial segmentation; it just lacks a video BIOS chip that would let it boot without a specific chipset requirement.

I assume for DG1 they wanted to get a "discrete GPU" out fast so they hacked it quick and dirty.

That doesn't mean they can't artificially segment future GPUs to only work on their own platforms, but hopefully they are not that stupid. I'm sure they know that as a newcomer you need to be open and work with everyone.

They are great with open source on Linux, for example.
 
DG1 was a test run and only used in certain OEM builds. Obviously Arc cards do not have those restrictions (special OEM stuff aside).
 
Jokes aside, how's current state of crypto mining algorithms on Intel GPUs?

Still doesn't work, at all?

From what I have read, there were a few ways to get some OpenCL-based mining software to work on Intel iGPUs if the software specifically targeted Intel iGPUs. Standard OpenCL software targeting a generic OpenCL-compliant iGPU/dGPU didn't run. Intel did something with their drivers in 2020 that broke all of that software. So if you want to mine on an Intel iGPU today, you can, but only if there are older drivers available for it.
 
Jokes aside, how's current state of crypto mining algorithms on Intel GPUs?

Still doesn't work, at all?

I don't think it matters at this point. What would you mine on an iGPU? Without high-bandwidth dedicated VRAM, crypto mining is pointless.

You can mine on an AMD iGPU because it's a derivative of the dGPU part. Since it uses the same drivers and software, it's supported, simple as that.

Not that it matters. Back when the DAG file was small enough, you could get maybe 3-4 MH/s @ 50 W. iGPUs are efficient for gaming, but on crypto mining they fall completely flat on their face.
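For scale, here's the efficiency gap in numbers (a back-of-envelope sketch; the iGPU figure is from above, the dGPU figure is an assumed illustrative value, not a measurement):

```python
# iGPU figure from the post above; dGPU figure is an assumption for comparison.
igpu_mhs, igpu_watts = 3.5, 50.0    # ~3-4 MH/s @ ~50 W (claimed above)
dgpu_mhs, dgpu_watts = 60.0, 130.0  # assumption: a mining-tuned midrange dGPU
print(igpu_mhs / igpu_watts)        # ~0.07 MH/s per watt
print(dgpu_mhs / dgpu_watts)        # ~0.46 MH/s per watt
```

Even before the DAG outgrew iGPU memory, the perf-per-watt gap alone made it a non-starter.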
 
Yeah, those measurements have to be off. They would make the die close to 1,600 mm², which would be larger than anything ever released by either AMD or Nvidia by about double.

You also wouldn't be able to fit very many of them on a wafer. Probably fewer than 30 from napkin math.
 