I don't think it mattered at this point.
So, who's planning on aping into the Intel hype train, just like me, and buying that hopium they sell?
Who's planning on building a profanum of a PC: AMD CPU and Intel dGPU?
🙂
Jokes aside, what's the current state of crypto mining algorithms on Intel GPUs?
Do they still not work at all?
Intel can't really make a splash if they require Samsung or TSMC to fabricate their GPUs. They'd still compete for wafers with AMD and NVidia, among other companies needing fab time and capacity. On the other hand, your conjecture about bundle deals makes a lot more sense and would incentivize purchases of their Alder Lake hardware. No idea what their AIB partners will be like, if any, but I can see bundle deals being the best way to get product moving and make a dent in AMD's and NVidia's side. This all depends on Arc being good and, well, 12th-gen Alder Lake not being a steaming pile of doo.

Rumors also say those GPUs will be cheap because Intel wants to make a splash entry into the market, flooding it with GPUs at great value.
P.S. I would not be surprised if we saw CPU+GPU+mobo bundles from Intel.
They could make getting GPUs at MSRP possible this way...
Ian Cutress making his own YouTube channel and not cross-posting on AT speaks for itself on how far AnandTech has gone down, minus the forums.
FTFY. He seems more interested in doing fluff interviews and debating on Twitter anyway, so it's not exactly like he's putting out quality content that trounces what's showing up on AT proper.
Intel seems to expect they can compete up to RTX 3070/6700XT with their fastest SKU.
Leaked slide shows Intel DG2 (Arc Alchemist) GPUs compete with GeForce RTX 3070 and Radeon RX 6700XT - VideoCardz.com
Intel Arc Alchemist to compete in 100–500 USD price range. Intel DG2 GPUs are set to compete with NVIDIA's upper mid-range segment officially, according to this leaked slide. Should the slide be real, Intel is clearly not planning to compete with NVIDIA GA102 and AMD Navi 21 GPUs in terms of... (videocardz.com)
It's also nice to see they are targeting the lower-end range with a 75W SKU, which is probably the 128EU version. Videocardz says it's either DG2-384 or DG2-256, but I don't think it is. There are only two dies as far as we know, 512 and 128 EUs, which I believe is what they mean with SOC1 and SOC2, so it makes more sense that Intel is using the smaller die for this rather than a cut-down 512 version.
I think that Intel is missing an opportunity for product synergy here. They have the perfect product to pair with their dGPUs: Optane!
Hear me out: we know that putting a ton of very high-speed RAM on graphics cards gets expensive quite quickly, in both cost and power draw. Why not enhance the product with something they already have but can't seem to find a good market for? If they integrated a few Optane chips onto their cards, they could have a pool of 64+ GB of second-class VRAM to use as "near" storage. This would allow the massive levels in modern games to be preloaded before the game launches and drawn from directly by the GPU during gameplay.
This is not without precedent. AMD did something similar with two cards that integrated SSDs to provide massive VRAM buffers for professional applications. The main drawbacks of that approach are the response time of the SSDs and the fact that they are subject to rapid wear. Optane improves on both: it has a very quick (for an SSD) response time AND an extremely high endurance rating. Its main hangups are cost per GB and relatively low density. Since we're only talking about 64-128 GB, that cost won't be extreme, and modern Optane chips are dense enough to meet that need with very few packages.
I'm not proposing that the GPU render directly from the Optane; I'm suggesting it be used to rapidly swap chunks of data into the card's VRAM without having to go out over the PC's PCIe bus. Instead, the card can use its own private bus to the Optane, with zero contention and rapid setup times.
It seems like it could be a decent match.
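To make the mechanism concrete, here is a minimal sketch in Python of how that two-tier pool could behave. Everything in it is a hypothetical illustration (the pool sizes, the `TieredVram` class, the LRU policy), not any real driver API: assets are staged into the large Optane pool before the level starts, and the GPU swaps them into the fast VRAM pool on demand, only falling back to the PCIe bus when an asset is in neither tier.

```python
# Hypothetical sketch of a two-tier on-card memory pool: a small "fast" VRAM
# pool backed by a larger "near" Optane pool. LRU eviction keeps the fast
# pool within budget; evicted assets stay resident in the Optane tier.
from collections import OrderedDict

VRAM_BUDGET = 8 * 1024      # fast pool, MiB (e.g. GDDR6) -- illustrative
OPTANE_BUDGET = 64 * 1024   # near pool, MiB (on-card Optane) -- illustrative

class TieredVram:
    def __init__(self):
        self.vram = OrderedDict()   # asset_id -> size_mib, in LRU order
        self.optane = {}            # asset_id -> size_mib
        self.vram_used = 0

    def preload(self, asset_id, size_mib):
        """Stage an asset into the near pool before the level starts."""
        if sum(self.optane.values()) + size_mib > OPTANE_BUDGET:
            raise MemoryError("near pool full")
        self.optane[asset_id] = size_mib

    def bind(self, asset_id):
        """Make an asset GPU-resident, swapping it in from Optane if needed."""
        if asset_id in self.vram:                # already hot
            self.vram.move_to_end(asset_id)
            return "hit: vram"
        size = self.optane.get(asset_id)
        if size is None:                         # in neither tier: only now
            return "miss: fetch over PCIe"       # does the PCIe bus get used
        while self.vram and self.vram_used + size > VRAM_BUDGET:
            _, evicted = self.vram.popitem(last=False)   # evict coldest asset;
            self.vram_used -= evicted                    # it stays in Optane
        if self.vram_used + size > VRAM_BUDGET:
            raise MemoryError("asset larger than the fast pool")
        self.vram[asset_id] = size               # swap in over the card's
        self.vram_used += size                   # private link to the Optane
        return "hit: swapped from optane"

pool = TieredVram()
pool.preload("level_geometry", 6 * 1024)   # preloaded before gameplay
pool.preload("textures_hi", 5 * 1024)
print(pool.bind("level_geometry"))   # hit: swapped from optane
print(pool.bind("textures_hi"))      # evicts level_geometry, swaps in
print(pool.bind("level_geometry"))   # swaps back in, no PCIe traffic
```

The point of the sketch is the last line: re-binding an evicted asset never touches the PCIe bus, because the cold copy stays in the on-card Optane tier.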
There is no PL2 or short-term boost on a graphics card; it wouldn't make sense. You want fps consistency while gaming; nobody wants big fps drops after 60 seconds, and game tests are a lot longer than a Cinebench run. The drops on AMD/Nvidia cards are therefore very small.
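For anyone unfamiliar with the CPU-side behavior being contrasted here: Intel CPUs may run above their sustained limit (PL1) at a short-term limit (PL2) until a boost window (tau) expires. A toy sketch of why that model would hurt a GPU (the specific wattages and the 56-second window are illustrative assumptions, roughly in line with recent Intel desktop defaults, not exact specs):

```python
# Toy contrast between a CPU-style PL1/PL2 boost budget and a GPU-style
# fixed board-power limit. All numbers are illustrative assumptions.

def cpu_style_power(t_seconds, pl1=125, pl2=241, tau=56):
    """PL1/PL2 scheme: run at PL2 until the boost window tau expires, then PL1."""
    return pl2 if t_seconds < tau else pl1

def gpu_style_power(t_seconds, board_power=320):
    """Fixed board-power limit: the same budget for the entire run."""
    return board_power

# A Cinebench-length run sits inside the boost window; a long game
# benchmark doesn't, so a CPU-style scheme would step down mid-run.
for t in (10, 55, 60, 300, 600):
    print(f"t={t:3d}s  cpu-style={cpu_style_power(t)} W  gpu-style={gpu_style_power(t)} W")
```

A short benchmark finishes inside the boost window and measures the PL2 number; a long game run steps down to PL1 partway through. A flat board-power limit has no such step, which is exactly the consistency point above.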
I like this graph because it's RGB! xD
Uh, perhaps you haven't noticed - but the forums have plummeted over the past 10 years. We used to have professionals in CPU/GPU design, process development, etc. For various reasons, they have moved on (except DMENS, but he mostly snipes at Intel without giving much detail).
I've gotten to the point that I believe the GPU market is about to transition into something like the ASIC market for Bitcoin. Essentially, manufacturers will only bother to cater to miners, since that's where the bulk of their volume goes, without even having to market. Why would you bother catering to gamers, who will demand warranty service, BIOS fixes, etc., when you can make easy bulk sales to miners?
The two GPU companies were both badly burned by excess mining inventory before. In the last quarterly report AMD said they are not heavily "exposed" to crypto. The word choice is telling: they see it as a risk.

When has a modern company EVER taken the long game on ANYTHING? There is a boatload of short-term dollars to be made by catering to miners. The conversion costs to move between mining cards and gaming cards are minimal.